Dark patterns: is your website tricking people?

April 2022

Why should we be concerned about dark patterns?

Do you ever feel like a website or app has been designed to manipulate you into doing things you really don’t want to do? I bet we all have. Welcome to the murky world of ‘dark patterns’.

This term was coined in 2010 by user experience specialist Harry Brignull, who defines dark patterns as “features of interface design crafted to trick people into doing things they might not want to do and which benefit the business”.

Whenever we use the internet, businesses are fighting for our attention, and it’s often hard for them to cut through. Meanwhile, we often don’t have the time or inclination to read the small print; we just want to achieve what we set out to do.

Businesses can take advantage of this. Sometimes they make it difficult to do things which should, on the face of it, be really simple – like cancelling a service or saying no. They may try to lead you down a different path that suits their business better and leads to higher profits.

These practices are in the spotlight, and businesses could face more scrutiny in future. Sometimes dark patterns are deliberate, sometimes they may be accidental.

What are dark patterns?

There are many interpretations of dark patterns and many different examples. Here’s just a handful to give you a flavour – it’s by no means an exhaustive list.

  • ‘Roach Motel’ – this is where the user journey makes it easy to get into a situation, but hard to get out. (Perhaps ‘Hotel California’ might have been a better name?). For example, when it’s easy to sign up to a service, but very difficult to cancel it because it’s buried somewhere you wouldn’t think to look. And when you eventually find it, you still have to wade through several messages urging you not to cancel.
  • FOMO (Fear Of Missing Out) – this is when, for example, you’re hurried into making a purchase by ‘urgent’ messages showing a countdown clock or alerts saying the offer will end imminently.
  • Overloading – this is when we’re confronted with a large number of requests, information, options or possibilities to prompt us to share more data. Or it could be used to prompt us to unintentionally allow our data to be handled in a way we’d never expect.
  • Skipping – this is where the design of the interface or user experience is done in such a way that we forget, or don’t think about, the data protection implications. Some cookie notices are designed this way.
  • Stirring – this affects the choices we make by appealing to our emotions or using visual cues. For example, using a certain colour for buttons you’d naturally click for routine actions – getting you into the habit of clicking on that colour. Then suddenly using that colour button for the paid-for service, and making the free option you were after hard to spot.
  • Subliminal advertising – this is the use of images or sounds to influence our responses without us being consciously aware of it. This is banned in many countries as deceptive and unethical.

Social engineering?

Some argue the ‘big players’ in search and social media have been the worst culprits in the evolution and proliferation of dark patterns. For instance, the Netflix documentary ‘The Social Dilemma’ argued that Google and Facebook have teams of engineers mining behavioural data for insights on user psychology to help them evolve their interface and user experience.

The mountain of data harvested when we search, browse, like, comment, post and so on can be used against us, to drive us to behave the way they want us to – without us even realising. The rapid growth of AI could push this all to a whole new level if left unchecked.

The privacy challenge

Unsurprisingly there’s a massive cross-over between dark patterns and harm to user privacy. The way user interfaces are designed can play a vital role in good or bad privacy outcomes.

In the EU, discussions about dark patterns (under the EU GDPR) tend to concentrate on how dark patterns can be used to manipulate users into giving their consent – and point out consent would be invalid if it’s achieved deceptively or not given freely.

Here are some specific privacy related examples.

  • Tricking you into installing an application you didn’t want, i.e. consent is not unambiguous or freely given.
  • When the default privacy settings are biased to push you in a certain direction. For example, on cookie notices where it’s much simpler to accept than object, and can take more clicks to object. Confusing language may also be used to manipulate behaviour.
  • ‘Privacy Zuckering’ is a term used for when you’re tricked into publicly sharing more information about yourself than you really intended to. Named after Facebook’s co-founder & CEO, but the practice isn’t unique to them – LinkedIn, for example, has been fined for this.
  • When an email unsubscribe link is hidden within other text.
  • Where more screen space is given to options the business wants you to take, and less to options the customer might prefer – for example, emphasising a recurring subscription over a one-off purchase.

Should businesses avoid using dark patterns?

Many will argue YES! Data ethics is right at the heart of the debate. Businesses should ask themselves whether what they’re doing to encourage sales is fair and reasonable, and whether their practices could be seen as deceptive. Are they doing enough to protect their customers?

Here are just a few reasons to avoid using dark patterns:

  • They annoy your customers and damage their experience of your brand. A survey by HubSpot found 80% of respondents said they had stopped doing business with a company because of a poor customer experience. If your customers are dissatisfied, they can and will switch to another provider.
  • They could lead to higher website abandon rates.
  • Consent gathered by manipulating consumer behaviour is unlikely to meet the GDPR consent requirements, i.e. freely given, specific, informed and unambiguous. So your processing could turn out to be unlawful.
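One way to reduce that last risk is to keep evidence of how each consent was gathered. Here’s a minimal sketch in Python – the field names are hypothetical, not a prescribed format – mapping a consent record to the conditions for validity:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Evidence of how one consent was gathered (illustrative fields only)."""
    user_id: str
    purpose: str              # specific: one purpose per record
    wording_shown: str        # the exact text the user saw (informed)
    obtained_at: datetime
    freely_given: bool        # e.g. not bundled with access to the service
    unambiguous_action: str   # e.g. "ticked an unchecked box"

    def looks_valid(self) -> bool:
        # A crude screen: each condition needs supporting evidence.
        return (self.freely_given
                and bool(self.purpose)
                and bool(self.wording_shown)
                and bool(self.unambiguous_action))

# Example: a record gathered via a pre-ticked box would fail the screen
rec = ConsentRecord("user-1", "email newsletter",
                    "Tick here to receive our newsletter",
                    datetime(2022, 4, 1, tzinfo=timezone.utc),
                    True, "ticked an unchecked box")
```

A screen like this can’t prove consent is valid, but it makes gaps in your evidence visible before a regulator asks.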

Can these effects happen by mistake?

Dark patterns aren’t always deliberate. They can arise due to loss of focus, short-sightedness, poorly trained AI models, or a number of other factors. However, they are more likely to occur when designers are put under pressure to deliver on time for a launch date, particularly when commercial objectives are prioritised above all else.

Cliff Kuang, author of “User Friendly”, says there’s a tendency to make it easy for users to perform the tasks that suit the company’s preferred outcomes. The controls for limiting functionality or privacy controls can sometimes be an afterthought.

What can businesses do to prevent this?

In practice it’s not easy to strike the right balance. We want to provide helpful information to help our website / app users make decisions. It’s likely we want to ‘nudge’ them in the ‘right’ direction. But we should be careful we don’t do this in ways which confuse, mislead or hurry users into doing things they don’t really want to do.

It’s not like the web is unique in this aim (it’s just that we have a ton of data to help us). In supermarkets, you used to always see sweets displayed beside the checkout. A captive queuing audience, and if it didn’t work on you, a clear ‘nudge’ to your kids! But a practice now largely frowned upon.

So how can we do ‘good sales’ online without using manipulation or coercion? It’s all about finding a healthy balance.

Here are a few suggestions which might help your teams:

  • Train your product developers, designers & UX experts – not only in data protection but also in dark patterns and design ethics. In particular, help them recognise dark patterns and understand the negative impacts they can cause. Explain the principles of privacy by design and the conditions for valid consent.
  • Don’t allow business pressure and priorities to dictate over good ethics and privacy by design.
  • Remember data must always be collected and processed fairly and lawfully.

Can dark patterns be regulated?

The proliferation of dark patterns over recent years has largely been unrestricted by regulation.

In the UK & Europe, where UK & EU GDPR are in force, discussions about dark patterns have mostly gravitated around matters relating to consent – where that consent may have been gathered by manipulation and may not meet the required conditions.

In France, the CNIL (France’s data protection authority) has stressed that the design of user interfaces is critical to protecting privacy. In 2019, the CNIL took the view that consent gathered using dark patterns does not qualify as valid, freely given consent.

Fast forward to 2022 and the European Data Protection Board (EDPB) has released guidelines: ‘Dark patterns in social media platform interfaces: How to recognise and avoid them’.

These guidelines offer examples, best practices and practical recommendations to designers and users of social media platforms, on how to assess and avoid dark patterns in social media interfaces which contravene GDPR requirements. The guidance also contains useful lessons for all websites and applications.

They remind us we should take into account the principles of fairness, transparency, data minimisation, accountability and purpose limitation, as well as the requirements of data protection by design and by default.

We anticipate EU regulation of dark patterns may soon be coming our way. The International Association of Privacy Professionals (IAPP) recently said, “Privacy and data protection regulators and lawmakers are increasingly focusing their attention on the impacts of so-called ‘dark patterns’ in technical design on user choice, privacy, data protection and commerce.”

Moves to tackle dark patterns in the US

The US Federal Trade Commission has indicated it’s giving serious attention to business use of dark patterns and has issued a complaint against Age of Learning for the use of dark patterns in its ABCmouse service.

Looking at state-led regulations, in California modifications to the CCPA have been proposed to tackle dark patterns. The Colorado Privacy Act also looks to address this topic.

What next?

It’s clear businesses should be mindful of dark patterns and consider taking an ethical stance to protect their customers / website users. Could your development teams be intentionally or accidentally going too far? It’s good practice to train website and development teams so they can prevent dark patterns occurring, deliberately or by mistake.

What does the IKEA CCTV story tell us?

April 2022

Only set up video surveillance if underpinned by data protection by design and default

What happened?

Following an internal investigation, IKEA was forced to apologise for placing CCTV cameras in the ceiling voids above the staff bathroom facilities in their Peterborough depot. The cameras were discovered and removed in September 2021, but the investigation has only just concluded in late March 2022.

An IKEA spokesman said:

 “Whilst the intention at the time was to ensure the health and safety of co-workers, we understand the fact that colleagues were filmed unknowingly in these circumstances will have caused real concern, and for this we are sincerely sorry.”

The cameras were installed following “serious concerns about the use of drugs onsite, which, owing to the nature of work carried out at the site, could have very serious consequences for the safety of our co-workers”.

They had been sanctioned following “multiple attempts to address serious concerns about drug use, and the use of false urine samples as a way of disguising it”.

“The cameras placed within the voids were positioned only to record irregular activity in the ceiling voids,” he said.

“They were not intended to, and did not, record footage in the toilet cubicles themselves. However, as a result of ceiling tiles becoming dislodged, two cameras inadvertently recorded footage of the communal areas of two bathrooms for a period of time in 2017. The footage was not viewed at the time and was only recovered as part of these investigations.”

Apology and new ICO guidance

The key question raised by this incident is where to draw the line. When is it inappropriate to set up CCTV? In this instance, the company had concerns about drug misuse – but was that a good enough reason? I think a lot of us intuitively felt the answer was no. 

This apology conveniently coincides with the recent publication of new guidance on video surveillance from the ICO, covering the UK GDPR and Data Protection Act 2018.

This guidance is not based on any changes in the legislation – more an update to provide greater clarity about what you should be considering.

Video surveillance definition

The ICO guidance covers all of the following in a commercial setting:

  • Traditional CCTV
  • ANPR (automatic number plate recognition)
  • Body Worn Video (BWV)
  • Facial Recognition Technology (FRT)
  • Drones
  • Commercially available technologies such as smart doorbells and dashcams (outside domestic settings)

Guidance for domestic use is slightly different.

Before setting up your video surveillance activity 

As part of the system setup, it’s important to create a record of the activities taking place. This should be included in the company RoPA (Record of Processing Activities).

As part of this exercise, you need to identify:

  • the purpose of the surveillance
  • the appropriate lawful basis for processing
  • why the processing is necessary and proportionate
  • any data-sharing agreements
  • the retention periods for any personal data
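As a rough illustration, a single RoPA entry for a surveillance activity might capture the points above. The structure and field names here are hypothetical, not an ICO-prescribed format:

```python
# One illustrative RoPA entry for a video surveillance activity.
ropa_entry = {
    "activity": "CCTV monitoring of warehouse loading bay",
    "purpose": "Health and safety of staff; prevention of theft",
    "lawful_basis": "Legitimate interests (balancing test documented)",
    "necessity_and_proportionality": "Less intrusive measures assessed and found insufficient",
    "data_sharing_agreements": ["Security contractor (processor agreement in place)"],
    "retention_period": "30 days, then automatic deletion",
}

def ropa_entry_complete(entry: dict) -> bool:
    """Check that none of the fields identified above has been left empty."""
    return all(bool(value) for value in entry.values())
```

A completeness check like this is a useful guard: an entry with a blank lawful basis or retention period is a sign the thinking hasn’t been done before the cameras go up.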

 As with any activity relating to the processing of personal data, the organisation should take a data protection by design and default approach when setting up the surveillance system.

Before installing anything, you should also carry out a DPIA (Data Protection Impact Assessment) for any processing that’s likely to result in a high risk for individuals. This includes:

  • Processing special category data
  • Monitoring publicly accessible places on a large scale
  • Monitoring individuals at a workplace

A DPIA means you can identify any key risks as well as potential mitigation for managing these. You should assess whether the surveillance is appropriate in the circumstances.

In an employee context it’s important to consult with the workforce, consider their reasonable expectations and the potential impact on their rights and freedoms. One could speculate that IKEA may not have gone through that exercise.

Introducing video surveillance

Once the risk assessment and RoPA are completed, other areas of consideration include:

  • Surveillance material should be securely stored, preventing unauthorised access
  • Any data transmitted wirelessly or over the internet requires encryption to prevent interception
  • How easily data can be exported to fulfil DSARs
  • Ensuring adequate signage is in place to define the scope of what’s captured and how it’s used.

Additional considerations for Body Worn Video  

  • It’s more intrusive than CCTV so the privacy concerns are greater
  • Whether the data is stored centrally or on individual devices
  • What user access controls are required
  • Establishing device usage logs
  • Whether you want continuous or intermittent recording
  • Whether audio and video should be treated as two separate feeds

In any instance where video surveillance is in use, it’s paramount individuals are aware of the activity and understand how that data is being used.

Ransomware attack leads to £98k ICO fine

March 2022

Solicitors firm failed to implement ‘adequate technical and organisational measures’

Are you using Multi-Factor Authentication? Are patch updates installed promptly? Do you encrypt sensitive data?

Reports of cyber security incidents in the UK rose 20% in the last 6 months of 2021.

These figures from the ICO, combined with the heightened threat in the current climate, provide a stark warning to be alert.

The ICO says: “The attacks are becoming increasingly damaging and this trend is likely to continue. Malicious and criminal actors are finding new ways to pressure organisations to pay.”

Against this backdrop, the ICO has issued a fine to a solicitors’ firm following a ransomware attack in 2020.

The organisation affected was Tuckers Solicitors LLP (“Tuckers”) which is described on its website as the UK’s leading criminal defence lawyers, specialising in criminal law, civil liberties and regulatory proceedings.

While each organisation will face varying risks, this case highlights some important points for us all.

Here’s a summary of what happened, the key findings and the steps we can all take. For increasing numbers of organisations this case will unfortunately sound all too familiar.

What happened?

On 24 August 2020 Tuckers realised parts of its IT system had become unavailable. Shortly afterwards, its IT team discovered a ransomware note.

  • Within 24 hours it was established the incident was a personal data breach and it was reported to the ICO.
  • The attacker, once inside Tuckers’ network, installed various tools which allowed for the creation of a user account. This account was used to encrypt a significant volume of data on an archive server within the network.
  • The attack led to the encryption of more than 900,000 files of which over 24,000 related to ‘court bundles’.
  • 60 of these bundles were exfiltrated by the attacker and released on the ‘dark web’. These compromised files included both personal data and special category data.
  • The attacker’s actions impacted the archive server and backups. Processing on other services and systems was not affected.
  • By 7 September 2020, Tuckers updated the ICO to say the servers had been moved to a new environment and the business was operating as normal. The compromised data was effectively permanently lost; however, material was still available in a case management system unaffected by the attack.
  • Tuckers notified all parties identifiable within the 60 released court bundles, except seven for whom it had no contact details.

Neither Tuckers, nor third-party investigators, were able to determine conclusively how the attacker accessed the network in the first place. However, evidence was found of a known system vulnerability which could have been used either to access the network or to further exploit areas of Tuckers once inside the network.

What data was exfiltrated?

The data released on the ‘dark web’ included:

  • Basic identifiers
  • Health data
  • Economic and financial data
  • Criminal convictions
  • Data revealing racial or ethnic origin

This included medical files, witness statements and alleged crimes. It also related to ongoing criminal court and civil proceedings.

Tuckers explained to the Regulator that, based on its understanding, the personal data breach had not had any impact on the conduct or outcome of relevant proceedings.

However, the highly sensitive nature of the data involved increased the risk and potential adverse impact on those affected.

Four key takeaways

The ICO makes it clear in its enforcement notice that primary culpability for the incident rests with the attacker. But clear infringements by Tuckers were found.

The Regulator says a lack of sufficient technical and organisational measures gave the attacker a weakness to exploit.

Takeaways from this case:

1) Multi-Factor Authentication (MFA)

Tuckers’ GDPR and Data Protection Policy required two-factor authentication, where available. It was found that Multi-Factor Authentication (MFA) was not used for its ‘remote access solution’.

The ICO says the use of MFA is a relatively low-cost preventative measure which Tuckers should have implemented.

The Regulator concluded the lack of MFA created a substantial risk of personal data on Tuckers’ systems being exposed to consequences such as this attack.

Takeaway: If you currently don’t use MFA, now would be a good time to implement it.
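To illustrate what sits behind MFA, here is a minimal time-based one-time password (TOTP) generator following RFC 6238 – the scheme used by most authenticator apps. This is a sketch of the algorithm, not a production authentication system:

```python
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F                                    # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HOTP over a 30-second counter."""
    t = int(time.time()) if for_time is None else for_time
    return hotp(secret, t // step, digits)
```

The server and the user’s device share the secret; a login is only accepted when the submitted code matches the one computed server-side for the current time window. In practice you’d use an established authentication product rather than roll your own, but the point stands: this is a genuinely low-cost control.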

2) Patch management

The case reveals a high-risk security patch was installed in June 2020, more than FOUR months after its release.

The ICO accepts the attacker could have exploited this vulnerability during the unpatched period.

Considering the highly sensitive nature of the personal data Tuckers were handling, the Regulator concludes they should not have been doing so in an infrastructure containing known critical vulnerabilities. In other words the patch should have been installed much sooner.

Takeaway: Make sure patches are installed promptly, especially where data is sensitive.

3) Encryption

During the investigation Tuckers informed the ICO the firm had not used encryption to protect data on the affected archived server.

While the Regulator accepts this may not have prevented the ransomware attack itself, it believes it would have mitigated some of the risks posed to the affected individuals.

Takeaway: Free, open-source encryption solutions are available. Alternatively, more sophisticated paid-for solutions exist for those handling more sensitive data.

Also it’s worth checking you’re adequately protecting archives to the same standard as other systems.

4) Retention

The enforcement notice reveals some ‘court bundles’ affected in the attack were being stored beyond the set 7-year retention period.

Takeaway: This again exposes a common issue for many organisations. Too often data is held longer than is necessary, which can increase the scale & impact of a data breach.

Our comprehensive Data Retention Guidance is packed with useful tools, templates and advice on tackling how long you keep personal data for.
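A scheduled check is one way to surface over-held records before they become breach exposure. A minimal sketch in Python, assuming a hypothetical flat 7-year policy like the one in this case:

```python
from datetime import date, timedelta

# Hypothetical policy: records kept for 7 years (leap days approximated)
RETENTION_PERIOD = timedelta(days=365 * 7 + 2)

def due_for_review(created: date, today: date = None) -> bool:
    """True once a record has passed the retention period and should be
    reviewed for deletion (or have its continued retention justified)."""
    if today is None:
        today = date.today()
    return today - created > RETENTION_PERIOD

# Example: flag which records (by creation date) are overdue
bundles = {"bundle-a": date(2013, 5, 1), "bundle-b": date(2021, 2, 1)}
overdue = [name for name, created in bundles.items()
           if due_for_review(created, today=date(2022, 3, 1))]
```

Real retention schedules vary by record type and legal obligation, so a production version would look up the period per category rather than apply one constant, but even a crude sweep like this would have flagged the over-held court bundles in the Tuckers case.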

What else can organisations do?

Clearly, we can’t be complacent and shouldn’t cut corners. We need to take all appropriate steps to protect personal data and avoid common pitfalls. Here are some useful resources to help you:

  • Cyber Essentials – The enforcement action notes that, prior to the attack, Tuckers was aware its security was not at the level of NCSC Cyber Essentials. In October 2019, it was assessed against the ‘Cyber Essentials’ criteria and failed to meet crucial aspects of its requirements.

Cyber Essentials was launched in 2014 and is an information security assurance scheme operated by the National Cyber Security Centre. It helps to make sure you have the basic controls in place to protect networks and systems from threats.


  • ICO Ransomware guidance – The ICO has recently published guidance which covers security policies, access controls, vulnerability management, detection capabilities and much more.
  • DPN Data Breach Guide – Our practical guide covers how to be prepared, how to assess the risk and how to decide whether a breach should be reported or not.

You can read the full details of this case here: ICO Enforcement Action – Tuckers Solicitors LLP

Data Breach Guide

How to handle a data breach

Our practical, easy-to-read guide takes you through how to be prepared for a breach, and how to assess the risks should you suffer a personal data breach.


This data breach guide covers:

  • Common causes of breaches
  • Data incident and breach planning
  • How to assess the risks
  • Breach reporting checklists
  • How technology can help

Managing data transfers from the UK

February 2022

The new International Data Transfer Agreement (IDTA) and Addendum is a sensible evolution of the old SCCs

International Data Transfers – to recap

Whenever UK-based organisations arrange the transfer of personal data to a third country outside the UK, they need to make sure the transfers are lawful, by confirming the data security and rights of individuals remain protected when data leaves the country.

Since the famous “Schrems II” ruling by the European Court of Justice in 2020, this activity has been thrown into disarray. To remind you, this is the ruling which invalidated the EU-US Privacy Shield and raised concerns about the use of EU Standard Contractual Clauses (SCCs) to protect the data. 

Soon after, the European Commission set to work to update the EU SCCs. These were drafted and enacted fairly swiftly, taking effect on 27th June 2021.

What are the new EU SCCs?

The new EU SCCs were expanded to introduce more flexible scenarios: 

  • SCCs are now modular meaning that they can accommodate different scenarios, where you can pick the parts which relate to your particular situation.
  • The SCCs cover four different transfer scenarios, including processors:
    • Controller to controller
    • Controller to processor
    • Processor to controller
    • Processor to processor
  • More than two parties can accede to the SCCs, meaning additional controllers and processors can be added through the lifetime of the contract. This potentially reduces the administrative burden.

How did this affect the UK? 

On 28th June 2021 the UK’s adequacy decision was adopted. On 27th September 2021, the prior version of the SCCs expired.

In our webinar last year, it was obvious that everyone was confused. The situation caused by the “Schrems” ruling was compounded by the fact that Brexit had been completed. This meant we could no longer apply the SCCs approved in Europe. The UK needed its own SCCs, but they did not exist. 

The ICO consultation

From August to October 2021, the ICO conducted a consultation to understand how a UK version of these rules should be enacted. Since the UK had been granted an adequacy decision by the EU, we all hoped it would be possible to mirror the SCC arrangements in UK law, thus re-instating the means by which we can lawfully export data to places such as the US.

Anecdotally the resounding view was not to mess with the principles enshrined in the EU SCCs as it would simply add complexity to an already complex situation.

The ICO conclusion

In January, the ICO published the International Data Transfer Agreement (IDTA) and the International Data Transfer Addendum to the EU Commission Standard Contractual Clauses. To the layperson, the EU’s standards have been adopted. 

What’s included in the Agreement and Addendum? 

    1. The International Data Transfer Agreement (IDTA) replaces the old EU SCCs which were relied upon to provide the appropriate safeguards required under the UK GDPR for international data transfers from the UK. There are differences to the new EU SCCs – it is a single all-encompassing agreement that incorporates all the scenarios identified in EU SCCs. One can omit sections and there is no requirement for it to be signed. This is most useful for those creating new data transfer agreements.
    2. The UK Addendum is a far simpler document. It is an addendum to the EU SCCs where references to EU laws are replaced by references to UK laws. It allows businesses to use the EU SCCs for international data transfers from the EU but also from the UK. It is useful for those already using the EU SCCs who want a simple addendum to update the legal context.

When does this come into force?

The IDTA was laid before Parliament on 2nd February 2022. It comes into force on 21st March if there are no objections. To all intents and purposes, it’s in force now. The Information Commissioner’s Office (ICO) has stated the IDTA and UK Addendum:

“are immediately of use to organisations transferring personal data outside of the UK, subject to the caveat that they come into force on 21 March 2022 and are awaiting Parliamentary approval”.

What does this all mean?

In practice, UK businesses can breathe a sigh of relief and get on with their lives. There is clarity at last. Existing agreements need to be updated with the UK Addendum and new ones can be put in place with the International Data Transfer Agreement. There will be an administrative burden, but businesses now know what they need to do.  Good sense has prevailed. 

 

Managing Erasure Requests or DSARs via Third-Party Portals

January 2022

Do organisations have to honour them? Well, it depends…

Over the past few years GDPR, the California Consumer Privacy Act (CCPA) and other privacy regulations have led to specialist companies offering to submit Erasure or Data Subject Access Requests (DSARs) on behalf of consumers.

These online portals say they want to help people exercise their privacy rights, while enabling them to make requests to multiple organisations simultaneously.

Companies on the receiving end often receive such requests in volume, and not necessarily from consumers they even know. Requests can quote swathes of legislation, some of which may be relevant, some of which won’t apply in your jurisdiction.

If you haven’t had any yet, you may soon. Companies like Mine, Privacy Bee, Delete Me, Revoke and Rightly all offer these services.

They don’t all operate in the same way, so be warned the devil is in the detail.

How third-party portals work

Okay, bear with me – as said, there are different approaches. They may use one, or a combination, of the following elements:

  • Offer to simply submit requests on the individual’s behalf, then the consumer engages directly with each organisation
  • Offer people the opportunity to upload their details and proof of ID, so the portal can submit requests on their behalf without the consumer needing to validate their ID each time.
  • Provide a bespoke link which organisations are invited to use to verify ID/authority. (Hmmm, we’re told not to click on links to unknown third parties, right?)
  • Allow consumers to select specific named organisations to submit requests to
  • Make suggestions for which organisations the individual might wish to ‘target’
  • Offer to scan the individual’s email in-box to then make suggestions about which organisations are likely to hold their personal data. (Again, really? Would you knowingly let any third-party scan your in-box?).

Is this a good thing? Does it empower the consumer?

On the surface, this all seems fairly positive for consumers, making it simpler and quicker to exercise their privacy rights.

For organisations, these portals could be seen as providing an easier way of dealing with rights requests in one place – providing, perhaps, a more secure way of sharing personal data, for example when responding to a DSAR.

I would, however, urge anyone using these portals to read the small print, and any organisation in receipt of these requests to do their homework.

Why it’s not all straightforward

The following tale from one DPO may sound familiar…

We tend to find these requests slightly frustrating and time-consuming. First, we have to log all requests for our audit trails. We cannot simply ignore the requests; doing so could cause regulatory issues, not to mention problems if they are genuine requests.

More often than not, they are sent in batches and do not contain the information we require to search and make the correct suppression. Where we do have enough information to conduct searches, we often find the personal details do not exist on our database.

Another concern is whether the requests are actually meant for us. We recently received a number of requests for a competitor, who was clearly named on the requests. When we tried to contact the portal to explain this issue, we did not get a response and were essentially ignored, which leaves us in a predicament – do we continue with the request? Was it actually for our organisation or not?

So, there’s a problem. Requests might be submitted on behalf of consumers who organisations have never engaged with. Requests can arrive with insufficient information. We can’t always verify people’s identity, or the portal’s authority to act on their behalf. In these circumstances, do people genuinely want us to fulfil their Erasure or Access request?

What does the ICO say about third-party portals?

The regulator does reference online portals in its Right of Access guidance. It tells us we should consider the following:

  • Can you verify the identity of the individual?
  • Are you satisfied the third party has authority to act on their behalf?
  • Can you view the request without having to take proactive steps (e.g. paying a fee or signing up to a service)?

The ICO makes it clear it would not expect organisations to be obliged to take proactive steps to discover whether a DSAR has been made. Nor are you obliged to respond if you’re asked to pay a fee or sign up to a service.

The regulator says it’s the portal’s responsibility to provide evidence of its authority to act on an individual’s behalf. If we have any concerns, we’re told to contact the individual directly.

If we can’t contact the individual, the guidance tells us we should contact the portal and advise them we will not respond to the request until we have the necessary information and authorisation.
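The ICO’s decision flow above can be sketched as a simple triage routine. This is purely illustrative – the `PortalRequest` shape and `triage` function are my own invention, not anything the ICO prescribes:

```python
from dataclasses import dataclass

# Hypothetical representation of a rights request received via a third-party portal.
@dataclass
class PortalRequest:
    identity_verified: bool       # can we verify the individual's identity?
    authority_evidenced: bool     # has the portal evidenced authority to act?
    fee_or_signup_required: bool  # must we pay or sign up just to view it?

def triage(request: PortalRequest) -> str:
    # Per the guidance, there's no obligation to respond if viewing the
    # request requires paying a fee or signing up to a service.
    if request.fee_or_signup_required:
        return "no obligation to respond"
    # If identity or authority is in doubt, contact the individual directly;
    # failing that, tell the portal the request is on hold until the
    # necessary information and authorisation arrive.
    if not (request.identity_verified and request.authority_evidenced):
        return "seek verification before responding"
    return "process the request"
```

Even a crude checklist like this makes the point: most of the portal requests described above would stall at the verification step.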

This all takes time…

This is all very well, but for some organisations receiving multiple requests it’s incredibly time-consuming. Some organisations are receiving hundreds of these requests in a single hit, as Chris Field from Harte Hanks explains in ‘You’ve been SAR-bombed’.

In addition, we need to do our research and understand how the portal operates, checking whether we believe it’s bona fide or not.

Another DPO, whose company receives around thirty privacy requests from third-party portals a month, says: “Often these tools don’t provide anything more than very scanty info, so they all require responses and requests for more info”. This company takes the following approach: “We deal with the individual if it’s a legitimate contact detail, or we don’t engage.”

It really is a question of how much effort is reasonable and proportionate.

We must respect fundamental privacy rights, understand third-party portals may be trying to support this, but balance this with our duty to safeguard against fraud or mistakes.

Are Data Subject Access Requests driving you crazy?

January 2022

Complicated. Costly. Time-consuming...

… And driving me crazy. We’ve all heard the dreaded words, right? ‘I’d like a copy of my personal data.’

Which led me to think; is the fundamental privacy right of accessing our personal data becoming part of our increasingly litigious culture? The DSAR is now a staple opening shot for law firms handling grievance claims or employment tribunals, looking for potentially incriminating morsels of information.

Of course, this right must be upheld, but is the process fit for purpose? Employee-related requests, in particular, can entail a massive amount of work and the potential for litigation makes them a risky and complex area.

For some organisations, this is water off a duck’s back; they’ve always had access requests, anticipated volume would increase after GDPR, have teams to handle them, invested in tech solutions, have access to lawyers and so on.

Great stuff, but please spare a thought for others.

Plenty of businesses have lower volumes of DSARs. They’re unable to justify, or afford, extra resources. These guys are struggling under a system that assumes one size fits all.

Then there are businesses who’ve never even had a DSAR. For them, just one request can be an administrative hand grenade.

Of course some businesses are guilty of treating employees badly, but I wish things could be different. It’s about getting the balance right, that most elusive of things when creating regulatory regimes. Are the principles behind the DSAR important? Of course. Can the processes be improved? Definitely!

So be warned – here begins a micro-rant on behalf of the smaller guys. I’m feeling their pain.

What’s that sound? It’s wailing and the gnashing of teeth

It’s clear from our Privacy Pulse Report that DSARs are a significant challenge facing data protection professionals. One DPO told us:

“Vexatious requests can be very onerous. Controllers need broader scope for rejection and to refine down the scope, plus criteria for when they can charge… In my view, the ICO should focus on helping controllers to manage complex and vexatious DSARs.”

Some access requests are straightforward, especially routine requests where ‘normal’ procedures apply. However, some requests are made by angry customers or disgruntled ex-employees on a mission… and there’s no pleasing them. A troublesome minority appear to submit DSARs purely to cause inconvenience because they’re angry, but don’t go so far as to fall under the ‘manifestly unfounded’ exemption.

Anyhow, for all those of you out there dealing with this stuff, know that I feel your pain. Without any further ado…

My THREE biggest DSAR bugbears (there are others)

Everything!

We’re entitled to a copy of ALL our personal data (to be clear, this doesn’t mean we’re entitled to full documents just because our name happens to appear on them somewhere).

It’s true organisations are allowed to ask for clarification, and the ICO’s Right of Access Guidance provides some pointers on how to go about this.

Yet that tiny glimmer of hope is soon dashed – we’re told we shouldn’t seek clarification on a blanket basis. We should only seek it if it’s genuinely required AND we process a large amount of information about the individual.

Furthermore; “you cannot force an individual to narrow the scope of their request, as they are still entitled to ask for ‘all the information you hold’ about them.”

Why?

Let’s take the hypothetical (but realistic) case of an ex-employee who believes they’ve been unfairly dismissed. They worked for the company for 10 years, and they submit a DSAR but choose not to play along with clarifying their request. They want everything from over a decade of employment.

Do they really need this information? Or are they refusing to clarify on purpose? Is this a fair, proportionate ‘discovery process’? As I’ve said before, large organisations may be better placed to absorb this; it’s the not-so-big ones who can really feel the pain. And in my experience, much personal data retrieved after hours of painstaking work isn’t relevant or significant at all.

Emails!

I feel conflicted about the requirement to search for personal data within email communications and other messaging systems.

On the one hand we have the ICO’s guidance, which to summarise tells us:

  • personal data contained within emails is in scope (albeit I believe GDPR has been interpreted differently by other countries on this point);
  • you don’t have to provide every single email just because someone’s name and email address appear on it;
  • context is important and we need to provide emails where the content relates to the individual (redacted as necessary).

If you don’t have a handy tech solution, this means trying to develop reasonable processes for retrieving emails, then eliminating those which won’t (or are highly unlikely to) have personal data within the content. This takes a lot of time.
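Absent a dedicated tool, that first-pass retrieval step might look something like the following sketch. The message shape and function name are my own invention, and a keyword match is only a starting point for human review:

```python
# Hypothetical message shape: a dict with 'headers' and 'body' strings.
def first_pass(messages: list, requester_name: str, requester_email: str) -> list:
    """Crude first-pass retrieval: keep messages mentioning the requester."""
    needles = (requester_name.lower(), requester_email.lower())
    hits = []
    for msg in messages:
        text = (msg["headers"] + " " + msg["body"]).lower()
        # A match only flags the message for manual review and redaction;
        # it says nothing about whether the content is actually disclosable.
        if any(needle in text for needle in needles):
            hits.append(msg)
    return hits
```

Every message this returns still needs a human to judge context and redact third-party details – which is exactly where the hours go.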

Why am I conflicted? In running a search of your email systems for a person’s name and email address, you’ll inevitably retrieve a lot of personal data relating to others.

They might have written emails about sensitive or confidential matters, now caught within the retrieval process. Such content may then be reviewed by the people tasked with handling the request.

I suspect this process can negatively impact wider employee privacy. Yes, we’re able to redact third-party details, but by searching the emails in the first place, we’re delving into swathes of other people’s personal data.

It seems everyone else’s right to privacy is thrown out in the interests of fulfilling one person’s DSAR.

It also makes me wonder; if I write a comment that might be considered disparaging about someone in an email, do I have any right to this remaining private between me and the person I sent it to? (Even if it wasn’t marked confidential or done via official procedure).

I know many DPOs warn their staff not to write anything down, as it could form part of a DSAR. I know others who believe they’re justified in not disclosing personal data about the requester, if found in other people’s communications. Which approach is right?

Time!

Who decided it was a good idea to say DSARs had to be fulfilled within ‘one calendar month’?

It wasn’t a good idea! The phrase led to the ICO having to offer this ‘clarification’:

You should calculate the time limit from the day you receive the request, fee or other requested information (whether it is a working day or not) until the corresponding calendar date in the next month.

If this is not possible because the following month is shorter (and there is no corresponding calendar date), the date for response is the last day of the following month.

If the corresponding date falls on a weekend or a public holiday, you have until the next working day to respond.

This means that the exact number of days you have to comply with a request varies, depending on the month in which an individual makes the request.

For practical purposes, if a consistent number of days is required (e.g. for operational or system purposes), it may be helpful to adopt a 28-day period to ensure compliance is always within a calendar month.

I hope you got that.
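If it helps, the calendar-month rule quoted above can be boiled down to a few lines of code. This is a minimal sketch of my own (not an ICO-endorsed calculator), and you’d need to supply your own set of public holidays:

```python
from datetime import date, timedelta
import calendar

def dsar_deadline(received: date, holidays: frozenset = frozenset()) -> date:
    """Sketch of the ICO's calendar-month rule. Not legal advice."""
    # Corresponding calendar date in the following month...
    year = received.year + received.month // 12
    month = received.month % 12 + 1
    # ...or, if no corresponding date exists (the next month is shorter),
    # the last day of that month.
    last_day = calendar.monthrange(year, month)[1]
    deadline = date(year, month, min(received.day, last_day))
    # If that falls on a weekend or public holiday, roll forward
    # to the next working day.
    while deadline.weekday() >= 5 or deadline in holidays:
        deadline += timedelta(days=1)
    return deadline
```

For instance, a request received on 2nd December 2021 lands on 2nd January 2022 – a Sunday – so the deadline rolls to Monday 3rd January, before you’ve even accounted for the bank holidays in between.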

Wouldn’t it have been easier to have a set number of days? And perhaps a more realistic timescale?

Let’s take a hypothetical (but realistic) case: you receive a DSAR on 2nd December. You can’t justify an extension as it isn’t unduly complex.

Yes, I know you’re with me; bank holidays and staff leave suddenly mean the deadline is horribly tight.

I wish there was a specific number of days to respond. I wish it excluded national bank holidays, and I wish there was a reprieve for religious festivals. I know, I’m dreaming.

DSARs and UK data reform

Is the UK Government going to try to address these challenges in its proposals to reform UK data protection law?

The consultation paper makes the right noises about the burden DSARs place on organisations, especially smaller businesses.

Suggestions include introducing a fee regime, similar to that within the Freedom of Information Act. One idea is a cost ceiling, while the threshold for responding could be amended. None of this is without challenges. There’s also a proposal to re-introduce a nominal fee.

On the latter point, GDPR removed the ability to charge a fee. You may recall prior to 2018 organisations could charge individuals £10 for a copy of their personal data.

Many will disagree, but I think the nominal fee is reasonable. I realise it could be seen as a barrier to people on lower incomes exercising a fundamental right. However, my view is organisations wouldn’t be forced to charge. It would be their choice. They would also be able to use their discretion by waiving the fee in certain situations. It makes people stop and think: ‘do I really want this?’

Whatever transpires, I truly hope some practical changes can be made to support small and medium-sized businesses. Balancing those with individual rights isn’t easy, but that’s why our legislators are paid the big bucks.

And here, dear reader, endeth my rant!

Data Protection Officers – what does it take to do the job?

January 2022

The unique blend of traits and skills which make for a great DPO

What is it that makes a DPO effective and successful? Whether you’re recruiting or someone interested in the role, here are a few thoughts for you to chew over. I’m focussing here more on character traits, rather than the specialist knowledge and skills required for the job.

Be a good leader – not just a manager

A DPO should be a self-starter, with the energy and motivation to lead and inspire others. They need the leadership skills to set the direction of travel for data protection across the organisation, laying out clear priorities and bringing others with them on the journey.

As Mark Starmer puts it in ‘Will the real leader please stand up?’, leadership is all about being able to influence. This means building effective relationships with everyone from senior management to clients, customers and so on. All this helps the DPO with their quest to embed data protection principles and processes across the organisation.

If they have direct reports, they’ll need to be someone who can lead and inspire their team. This includes recognising people’s individual strengths and weaknesses, their progress and achievements. Finding appropriate and perhaps innovative ways to recognise and reward each individual.

Thirst for knowledge

Not only does a DPO need to have an excellent grasp of the relevant laws, and ideally qualifications to evidence this, but they also need to be someone who is always on a quest to learn more. Someone who is happy to spend their spare time reading new guidance, privacy articles and opinions, case law and so on. Someone with a genuine interest in the data landscape and emerging trends.

Autonomy and independence

A DPO must also be able to act autonomously, independently and objectively, as the role requires. Not only looking at what the law requires, but also considering ethical and moral issues, to work out what is the right thing to do. Acting with genuine honesty and integrity.

As Robert Bond, Senior Legal Counsel at Bristows, puts it:

“Data Protection Officers must be adept and be able to adapt and adopt as circumstances require. Above all they need to implement compliance & ethics with impartiality.”

A great communicator and diplomat

Strong communication skills are vital. Taking the time to actively listen, interpret and understand others.

A DPO is likely to work with a range of staff across the organisation, plus clients and suppliers. Often working across national borders too. This requires cultural awareness and sensitivity. They need to be able to change their approach, depending on who they are talking to.

As Fedelma Good, Director at PwC UK explains:

‘DPOs need to be great communicators and above all they need to be multi-lingual. They need to be able to communicate across a broad range of stakeholders, ranging from board members to web designers and quite often they need to act as the translator to ensure that technical, legal and business specialists really do all understand each other.’

Sympathetic but strong

A good DPO will be both understanding and assertive. There’ll be times when people are tricky to handle, be it disgruntled customers or even perhaps a member of the senior management team!

The role doesn’t exist to preserve the status quo. They may need to push back against established practices (‘we’ve always done it that way’) and challenge people to think differently and find creative solutions. This takes sheer persistence and the drive to make a difference.

Confidence

A DPO should be a confident individual who is up for some straight-talking when needed. They must be ready to stand their ground. But they also need the confidence to show humility and say when they don’t know the answer. The laws are detailed and complex and no DPO can know it all.

To apply the law in practice, they often need time to think it through and deliberate. DPOs need to be clear when they need this time and need to resist the temptation (or demands) to respond immediately.

Well-organised

Sometimes everyone seems to be clamouring for a piece of the DPO. Juggling multiple conflicting priorities means being well-organised is critical. Some demands will be urgent, others important but less urgent, some can wait. That data breach always seems to happen on a Friday afternoon!

A DPO will inevitably need to do their fair share of ‘fire-fighting’ when things crop up out of the blue. They need to manage not only their diary, but colleagues’ expectations too!

Even at the busiest times, it’s also important to try and remain approachable with an ‘open door’ to anyone in the organisation.

Finding workable solutions

Because of the specialist knowledge and obligations a DPO has, they need to work hard to show how their role acts as an enabler for the business. Nobody wants to be seen as ‘the department of No’.

In my view this often comes back to character and communication style – being ready not only to shine a light on compliance risks but also to go the extra mile, working closely with stakeholders to find pragmatic solutions.

Taking a more flexible solution-oriented approach builds much better relationships, where the rest of the business sees the DPO as someone who doesn’t put up barriers, but will help them navigate their way to reach their goals.

This is especially important during times of change. Someone who can embrace change, stay positive and focussed and keep working towards shared goals is more likely to succeed in the end.

In conclusion

Wow, the DPO role is certainly demanding, requiring a lot of positive character traits and interpersonal skills!

All nicely summed up by Matt Kay, Deputy DPO at Metro Bank:

“It goes without saying that the role of a DPO is multi-faceted requiring a broad skillset with organisations valuing certain skills more than others, and this of course differs between organisations. For me I think the key skills are stakeholder engagement, the ability to project manage, navigate conflicting priorities and being able to take a pragmatic approach. Taking risk based decisions that balance the needs of data subjects and the organisation you work for.”