Data breaches – human or a catalogue of errors?
Why systems fail
The recent spate of serious data breaches, not least the awful case involving the Police Service of Northern Ireland (PSNI), left me wondering: who’s really to blame? We’re used to hearing about human error, but is it too easy to point the finger?
Is it really the fault of the person who pressed the send button? An old adage comes to mind, ‘success has a thousand fathers, failure is an orphan.’
Of course, people make mistakes. Training, technology and procedures can easily fail if ignored, either wilfully or otherwise. Yes, people are part of the equation. But that’s what it is. An equation. There are usually other factors at play.
In the PSNI case – one involving safety-critical data – I would argue that any system allowing such unredacted material to enter an FOIA environment in the first place is flawed.
Nobody is immune from human error. About nine years ago, on my second day in a new compliance role, I left my rucksack on the train. Doh! Luckily, there was no personal data relating to my new employer inside. I lost my workplace starter pack and had to cancel my debit card. I recall the sinking feeling as my new boss said, ‘well, that’s a bit embarrassing for someone in your job’. It was. But I knew it could have been so much worse.
The Information Commissioner’s Office (ICO) classifies approximately 80% of data breaches as caused by human error. Common mistakes include:
- Email containing personal data sent to the wrong recipients
- Forwarding attachments containing personal data in error
- Failing to notice hidden tabs or lines in spreadsheets which contain personal data (this is one of the causes cited in the PSNI case)
- Sensitive mail going to the wrong postal address (yes, a properly old-fashioned dead wood data breach!)
However, sometimes I hear about human error breaches and don’t think ‘how did someone accidentally do that?’ Instead, I wonder…
- Why didn’t anyone spot the inherent risk of having ALL those records in an unprotected spreadsheet in the first place?
- Why wasn’t there a system in place to stop people forgetting to blind copy (bcc) email recipients?
- Is anyone reviewing responses to Data Subject Access Requests or FOI requests? What level of supervision / QA exists in that organisation?
- Why is it acceptable for someone to take confidential papers out of their office?
I could go on.
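Several of those failure points can be engineered out rather than trained away. As a purely illustrative sketch (the threshold and function below are my own assumptions, not any organisation’s real policy or product), an outgoing-mail system could refuse to send bulk messages whose recipients have been left visible:

```python
# Illustrative sketch of a pre-send check that stops bulk mail going out
# with recipients in the visible To/Cc fields instead of Bcc.
# The threshold is an assumed policy value, not a real standard.

VISIBLE_RECIPIENT_LIMIT = 5

def check_recipient_fields(to, cc, bcc):
    """Return a list of warnings for an outgoing message.

    `to`, `cc` and `bcc` are lists of email addresses.
    """
    warnings = []
    visible = len(to) + len(cc)
    if visible > VISIBLE_RECIPIENT_LIMIT:
        warnings.append(
            f"{visible} addresses in To/Cc - use Bcc for bulk mail"
        )
    # An address in both a visible field and Bcc suggests a mix-up
    overlap = (set(to) | set(cc)) & set(bcc)
    if overlap:
        warnings.append(f"addresses duplicated across fields: {sorted(overlap)}")
    return warnings
```

A mail gateway wired to a check like this could block the send outright, or force a second pair of eyes, rather than relying on every individual remembering to use bcc every time.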
Technical and Organisational Measures (TOMs)
Rather than human error, should we be blaming a lack of appropriate technical and organisational measures (TOMs) to protect personal data? A fundamental data protection requirement.
We all know robust procedures and security measures can mitigate the risk of human error. A simple example – I know employees who receive an alert if they’re about to send an attachment containing personal data without a password.
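The alert those employees receive will come from a commercial data-loss-prevention tool, but the underlying idea is simple enough to sketch. The patterns and function below are illustrative assumptions only – real tools use far richer detection:

```python
import re

# Illustrative patterns only - real DLP tools use far richer detection
# (document fingerprinting, structured-data matching, machine learning).
PATTERNS = {
    "email address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "UK National Insurance number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),
}

def scan_attachment(text, password_protected):
    """Return alerts if the attachment text appears to contain personal
    data and is about to be sent without password protection."""
    if password_protected:
        return []
    return [
        f"attachment may contain a {label} - add a password before sending"
        for label, pattern in PATTERNS.items()
        if pattern.search(text)
    ]
```

Even a crude check like this catches the routine slip; the point is that the system prompts the sender at the moment of risk, instead of blaming them afterwards.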
Alongside this, data protection training is a must, but it should never be a ‘tick-box’ exercise. It shouldn’t be a case of annual online training module completed; no further action required! We need to make sure training is relevant and effective, and delivers key learning points and messages. Training should be reinforced with regular awareness campaigns. Using mistakes (big or small) as case studies is a good way to keep people alert to the risks. This is another reason why post-event investigation is so important as a lesson-learning exercise.
Rather than being a liability, if we arm people with enough knowledge they can become our greatest asset in preventing data breaches.
Chatting with my husband about this, he mentioned a boss once asking him to provide some highly sensitive information on a spreadsheet. Despite the seniority and insistence of the individual, my husband refused. He offered an alternative solution, with protecting people’s data at heart. Armed with enough knowledge, he knew what he had been asked to do was foolhardy.
Lessons from previous breaches
It’s too early to say precisely what led to these recent breaches:
- The Police Service of Northern Ireland releasing a spreadsheet containing the details of 10,000 police officers and other staff in response to a Freedom of Information request
- Norfolk and Suffolk Police accidentally releasing details of victims and witnesses of crime
- A Scottish genealogy website revealing the names of thousands of adopted children.
However, we can learn from previous breaches and the findings of previous ICO investigations.
You may recall the case of Heathrow Airport’s lost unencrypted memory stick. It was ostensibly a case of human error, but the ICO established that the Airport had failed not only ‘to ensure that the personal data held on its network was properly secured’, but also to provide sufficient training in relation to data protection and information security. The person blamed for the breach was unaware the memory stick should have been encrypted in the first place.
Then there was the Cabinet Office breach, in which people’s home addresses were published in the New Year’s Honours list. The person who actually published the list must have had a nightmare when they realised what had happened. But the ICO’s findings revealed that a new IT system had been rushed in and set up incorrectly. The procedure people were given to follow was wrong. A tight deadline meant short-cuts were taken. The Cabinet Office was found to have been complacent.
The lesson here? Data breaches aren’t always solely the fault of the person pressing the ‘send’ button. Too often, systems and procedures have already failed. Data protection is a mindset. A culture. Not an add-on. As the PSNI has sadly discovered, in the most awful of circumstances.
The impact breaches can have on employees, customers, victims of crime, patients and so on, can be devastating. Just the knowledge that their data is ‘out there’ can cause distress and worry.
Data protection law doesn’t spell out what businesses must do. To know where data protection risks lie, we need to know what personal data we have across the business and what it’s being used for. Risks need to be assessed and managed. And the measures put in place need to be proportionate to the risk.