Why the Tory app data breach could happen to anyone
Shakespeare wrote (I hope I remembered this correctly from ‘A’ level English), ‘When sorrows come, they come not single spies but in battalions.’ He could’ve been writing about the UK Conservative Party which, let’s be honest, hasn’t been having a great time recently.
The Telegraph is reporting the party suffered its second data breach in a month. An error with an app led to the personal information of leading Conservative politicians – some in high government office – being available to all app users.
Launched in April, the ‘Share2Win’ app was designed as a quick and easy way for activists to share party content online. However, a design fault meant users could sign up to the app using just an email address. Then, in just a few clicks, they were able to access the names, postcodes and telephone numbers of all other registrants.
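To make the failure mode concrete, here's a minimal, hypothetical sketch in Python (emphatically not the actual Share2Win code) of the difference between a lookup that hands anyone with a registered email address the entire contact list, and one that only ever returns the caller's own record:

```python
from dataclasses import dataclass

@dataclass
class Registrant:
    email: str
    name: str
    postcode: str
    phone: str

# Illustrative data store; in a real app this would be a database.
REGISTRANTS = {
    "alice@example.com": Registrant("alice@example.com", "Alice", "SW1A 1AA", "07000 000001"),
    "bob@example.com": Registrant("bob@example.com", "Bob", "SW1A 2AA", "07000 000002"),
}

def list_registrants_flawed(requesting_email: str) -> list:
    """The flaw: knowing any registered email address unlocks everyone's details."""
    if requesting_email in REGISTRANTS:
        return list(REGISTRANTS.values())  # no authorisation check, no data minimisation
    raise PermissionError("unknown user")

def get_own_record(requesting_email: str) -> Registrant:
    """A safer default: a user can only ever retrieve their own record."""
    record = REGISTRANTS.get(requesting_email)
    if record is None:
        raise PermissionError("unknown user")
    return record
```

The point isn't the particular language or framework; it's that the second function had to be designed that way before launch, not patched in after the headlines.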
This follows another recent Tory Party email blunder in May, where all recipients could see each other’s details.
In the heat of a General Election, some might put these errors down to ‘yet more Tory incompetence’. I’d say, to quote another famous piece of writing, ‘He that is without sin among you, let him first cast a stone’! There are plenty of examples where other organisations have failed to take appropriate steps to make sure privacy and security are baked into their app’s architecture. And this lack of oversight extends beyond apps to webforms, online portals and more. It’s depressingly common, and easily avoided.
In April, a Housing Association was reprimanded by the ICO after launching an online customer portal which allowed users to access documents (revealing personal data) they shouldn’t have been able to see. These related to, of all things, anti-social behaviour. In March, the ICO issued a reprimand to the London Mayor’s Office after users of a webform could click on a button and see every other query submitted. And the list goes on. This isn’t a party political issue. It’s a lack of due process and carelessness issue.
It’s easy to see how it happens, especially when, as in a snap election, there’s a genuine sense of urgency. Some bright spark has a great idea, senior management love it, and demand it’s implemented pronto! Make it happen! Be agile! Be disruptive! (etc).
But there’s a sound reason why the concept of data protection by design and by default is embedded into data protection legislation, and it’s really not that difficult to understand. As the name suggests, data protection by design means baking data protection into business practices from the outset: considering the core data protection principles such as data minimisation and purpose limitation, as well as integrity and confidentiality. Crucially, it means not taking short-cuts when it comes to security measures.
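As a small, hypothetical illustration of the ‘by default’ part (not drawn from any of the systems mentioned above), the routine that prepares a record for sharing can start from an explicit allow-list of fields, so personal data has to be deliberately added rather than accidentally leaked:

```python
# Fields approved for sharing; anything not listed here never leaves the system.
ALLOWED_SHARE_FIELDS = {"campaign_id", "content_url", "share_count"}

def to_shareable(record: dict) -> dict:
    """Return only the approved fields; drop everything else by default."""
    return {key: value for key, value in record.items() if key in ALLOWED_SHARE_FIELDS}

print(to_shareable({
    "campaign_id": 42,
    "content_url": "https://example.org/post/42",
    "share_count": 17,
    "email": "alice@example.com",   # personal data is excluded automatically
    "postcode": "SW1A 1AA",
}))
# {'campaign_id': 42, 'content_url': 'https://example.org/post/42', 'share_count': 17}
```

Data minimisation in one line, in other words: the safe behaviour is the default, and exposing more requires a conscious decision.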
GDPR may have its critics, but this element is just common sense – something most people would get on board with. A clear and approved procedure for new systems, services and products which covers data protection and security is not a ‘nice to have’ – it’s a ‘must have’. It can go a long way to protect individuals and mitigate the risk of unwelcome headlines further down the line, when an avoidable breach puts your customers’, clients’ or employees’ data at risk.
Should we conduct a DPIA?
A clear procedure can also alert those involved to when a Data Protection Impact Assessment is required. A DPIA is mandatory in certain circumstances where activities are higher risk, but even when not strictly required it’s a handy tool for picking up on any data protection risks and agreeing measures to mitigate them from Day One of your project. Many organisations would also want to make sure there’s oversight by their Information Security or IT team, in the form of an Information Security Assessment for any new applications.
Developers, the IT team and anyone else involved must be armed with the information they need to make sound decisions. Data protection and information security teams need to work together so that apps (or other new developments) don’t become a leaky bucket. Building this in from the start actually saves time too.
In all of this, don’t forget your suppliers. If you want to outsource the development of an app to a third-party supplier, you need to check their credentials and make sure you have the necessary controller-to-processor contractual arrangements and assessment procedures in place – especially if, once the app goes live, the developer’s team still has access to the personal data it collects. Are your contractors subbing work out to other third-party subcontractors? Do they work overseas? Will these subcontractors have access to personal data?
The good news? There’s good practice out there. I remember a data protection review DPN conducted a few years back. One of the areas we looked at was an app our client had developed for students to use. It was a pleasure to see how the app had been built with data protection and security at its heart. We couldn’t fault the team who designed it – and as a result the client didn’t compromise their students, face litigation, look foolish or get summoned to see the Information Commissioner!
In conclusion? Yes, be fast. Innovate! Just remember to build your data protection strategy into the project from Day One.