Could the Right of Access leave consumers open to privacy risks?
News that an Oxford University PhD student and cyber security researcher managed to collect personal data about his girlfriend, including sensitive personal information, by submitting 150 Data Subject Access Requests (DSARs) is likely to prompt some organisations to review their identity verification processes.
James Pavur, with his girlfriend’s permission, assumed her identity using limited details and a fake email address. He went on to elicit information about her from a number of organisations using the DSAR process. He presented the findings of his report, GDPArrrr: Using Privacy Laws to Steal Identities, at the recent Black Hat conference in Las Vegas.
His paper reveals that of the 84 companies that responded, only 39% requested what he considered to be a strong form of identity verification. He says the others either requested no further proof or had weak verification procedures.
Mr Pavur believes the legal ambiguity surrounding the Right of Access, and a lack of standard practice to verify the identity of the requestor, could inadvertently lead to abuse by social engineers. He suggests that medium-sized businesses are most at risk. His paper states: ‘We find that many organizations fail to employ adequate safeguards against Right of Access abuse and thus risk exposing sensitive information to unauthorized third parties.’ He also raises specific concerns that the short time limit organisations have to respond to DSARs, and the potential sanctions for failing to respond in time, may result in organisations taking short cuts.
What did James Pavur do?
In a first tranche of requests, Mr Pavur says he used basic information to submit DSARs from a ‘made up’ email address. This information was limited to his girlfriend’s address, full name, and other professional email addresses and telephone numbers (he maintains the latter could be found via publicly available sources). He then designed an access request letter, deliberately worded to make it tricky for the organisation receiving the request. For example, the request was broad and specifically requested information that third parties may hold on the organisation’s behalf:
‘In particular, please supply any personally identifiable information that your organization (or a third party organization on your behalf) stores about me. Please include data that your organization holds about me in your digital or physical files, backups, emails, voice recordings or other media you may store.’
It specified a request for information sourced from third parties:
‘If you are additionally collecting personal data about me from any source other than me, please provide me with information about these sources, as referred to in Article 14 of the GDPR.’
(Strictly speaking, Article 15(1)(g) would be the more appropriate reference here: ‘where personal data are not collected from the data subject’, organisations should provide ‘any available information as to the source’.)
It included raising a level of concern for the organisation that the requester may have reason to believe their personal data had been breached:
‘Finally, I would like to request information regarding if my personal data has been disclosed inadvertently by your company in the past, or as a result of a security or privacy breach.’
It acknowledged the organisation’s ability to request proof of ID, but perhaps made this sound somewhat onerous to fulfil:
‘If you require identity documents to complete these requests, provided that the sensitivity of these documents is proportional to the data I have already consented to allow your organization to store, I am willing to provide these documents via a secure, online portal as soon as possible.’
Mr Pavur was of the view that many organisations were unlikely to have a secure online portal to facilitate this.
In his second tranche of requests, Mr Pavur used information disclosed in the first to elicit more personal data. He stresses that he did not forge any proof of ID documents or his girlfriend’s signature, although he raises concerns that malicious attackers could deploy such tactics.
What was the response?
Of the 150 companies approached (in the UK and US), 84 confirmed they were storing information about his girlfriend. Of these, Mr Pavur says:
- 39% insisted on a strong form of identification
- 24% responded without further inquisition
- 16% accepted a weak form of identification
Interestingly, 5% of companies said they did not fall under the requirements as they were based in America and 3% took the step of immediately deleting the personal data they held, rather than disclosing it.
What’s not made clear by Mr Pavur is whether any organisations initially sought to clarify the nature of the request. The ICO’s Right of Access guidance says: ‘If you process a large amount of information about an individual you can ask them for more information to clarify their request. You should only ask for information that you reasonably need to find the personal data covered by the request.’ However, the guidance is clear that if an individual refuses to provide any additional information, organisations must still endeavour to comply with the request by making reasonable searches for the information it covers.
What personal data was disclosed?
The information Mr Pavur says he gleaned about his girlfriend ranged from simple public records through to more sensitive information. This included:
- An education services provider disclosing her social security number, mother’s maiden name and high school grades
- A threat intelligence company, which analyses data dumps to determine if organisations have been breached or compromised, provided a list of his girlfriend’s old passwords (deemed to have been compromised in past breaches). This is of concern given the known end-user habit of using variations of the same passwords
- A UK hotel chain disclosed a list of locations where his girlfriend had stayed (not good if he had been trying to find evidence of any infidelity). A travel company also revealed her past journeys
- He also managed to gather some financial information about her, including 10 digits from her credit card and the expiry date.
The organisations clearly believed they were providing personal data to the individual it related to. However, their failure to confirm this would represent a clear breach under data protection law.
What vulnerabilities does Mr Pavur believe this exposes?
Mr Pavur warns his approach was not as sophisticated as it could have been. He says his findings, ‘suggest a critical need to improve the implementation of the subject access request process’. Mr Pavur perceives the key vulnerabilities to be:
Time pressures:
The statutory pressure organisations are under to respond to DSARs within a short time period (without undue delay and at the latest within one calendar month). Furthermore, failure to meet this tight deadline leaves organisations open to potential sanctions. Ironically, this means prompt compliance could increase the chance of personal data being compromised by social engineers.
It’s important to point out that Subject Access Rights are not new, nor were they invented by GDPR – I was handling them more than a decade ago. However, increased publicity and the removal of the £10 fee have meant many organisations have seen a significant rise in requests, or may indeed be receiving them for the first time. This means increased resources are required to fulfil the right within a shorter time frame than previously allowed. The £10 fee was clearly seen by the law makers as hindering consumers’ ability to access their data; however, could its removal have inadvertently made it easier to make malicious requests?
Mr Pavur’s experiment highlights that, however simple or complex a request is, organisations are under the same time constraints. His report states: ‘Given that an organization typically has only one calendar month to respond to any given GDPR request (with limited extensions of up to one additional month), broad or complicated GDPR requests can be difficult to respond to within the allotted timeframe. Under these pressure dynamics, we hypothesized that organizations may be tempted to take shortcuts or be distracted by the scope and complexity of a request and pay less attention to the identity verification aspects of the law.’
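As a rough illustration of the clock organisations are racing against, the calendar-month deadline can be sketched in a few lines of Python. The function names, the extension parameter and the clamping rule for short months are my own illustration, not drawn from the regulation or from Mr Pavur’s paper:

```python
from datetime import date

def add_months(start: date, months: int) -> date:
    """Add whole calendar months, clamping to the last day of the
    target month (e.g. 31 January + 1 month -> 28/29 February)."""
    month_index = start.month - 1 + months
    year, month = start.year + month_index // 12, month_index % 12 + 1
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    month_lengths = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    return date(year, month, min(start.day, month_lengths[month - 1]))

def dsar_deadline(received: date, extension_months: int = 0) -> date:
    """Response deadline: one calendar month from receipt, plus any
    extension granted for complex requests."""
    return add_months(received, 1 + extension_months)

print(dsar_deadline(date(2019, 1, 31)))                      # 2019-02-28
print(dsar_deadline(date(2019, 3, 15), extension_months=2))  # 2019-06-15
```

The point the sketch makes is simply that the deadline is fixed on receipt, regardless of how broad or awkward the request turns out to be.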
Matthew Kay, Data Protection Officer at Thomson Reuters says, ‘The research shows that organisations, in facing stringent statutory time-frames for DSARs, can be liable to make mistakes. Whilst it’s important for organisations to meet these statutory time-frames they must not lose oversight of their overall compliance programme, ensuring robust measures are in place to correctly identify individuals making DSARs, to avoid an incorrect disclosure of personal data to a third party.’
Proof of Identity:
Mr Pavur specifically points to the lack of clarity on what constitutes proof of identity; his findings reveal that in some cases proof of ID was not sought, and in others the proof requested could, he believes, have been easy to fake.
Arguably, this is the most important point, time pressures aside.
So, what does GDPR say about verifying the identity of an individual? Recital 64 states:
‘The controller should use all reasonable measures to verify the identity of a data subject who requests access, in particular in the context of online services and online identifiers. A controller should not retain personal data for the sole purpose of being able to react to potential requests.’
This puts the onus on organisations to decide what is “reasonable”. What would be too little, what would be reasonable and what would be excessive? This is not stipulated by law and clearly open to widespread and possibly inconsistent interpretation.
The ICO Right of Access guidance states: ‘If you have doubts about the identity of the person making the request you can ask for more information. However, it is important that you only request information that is necessary to confirm who they are. The key to this is proportionality. You need to let the individual know as soon as possible that you need more information from them to confirm their identity before responding to their request. The period for responding to the request begins when you receive the additional information.’
This still leaves questions – what is reasonable? What is proportionate? It has led many organisations to consider the type of personal data they hold for an individual, balancing this with the level of proof they request – an issue we covered in our webinar hosted by Bristows LLP last year: Data Subject Access Requests: Your burning questions answered.
Some organisations may take the approach that where limited personal data is held, a less robust identification process would be more reasonable and proportionate. The obvious corollary is that where more sensitive data is to be disclosed, the process should be more robust. I think we’d all hope that if, for example, we were requesting medical information, there would be a robust process for ensuring this couldn’t be disclosed to the wrong person.
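One way to operationalise that proportionality principle is a simple tiering policy: the most sensitive category of data held dictates the strength of ID check demanded before disclosure. The categories, tiers and wording below are entirely hypothetical – a sketch of the idea, not anything taken from the ICO guidance or Mr Pavur’s paper:

```python
# Illustrative sensitivity scores per data category (1 = low, 3 = high).
SENSITIVITY = {
    "marketing_preferences": 1,
    "purchase_history": 2,
    "financial_details": 3,
    "medical_records": 3,
}

# Illustrative verification requirement for each tier.
VERIFICATION_TIERS = {
    1: "match request to email address already on file",
    2: "account login or knowledge-based check (e.g. recent activity)",
    3: "government-issued ID via a secure channel",
}

def required_verification(categories_held: list) -> str:
    """Pick the verification tier demanded by the most sensitive
    category of data held about the requestor."""
    tier = max(SENSITIVITY.get(c, 1) for c in categories_held)
    return VERIFICATION_TIERS[tier]

print(required_verification(["purchase_history", "medical_records"]))
# -> government-issued ID via a secure channel
```

A policy like this makes the ‘reasonable and proportionate’ judgement once, in advance, rather than leaving it to whoever happens to field each request under deadline pressure.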
Mr Pavur also points to what he describes as ‘novel forms of knowledge-based identity verification’, which he suggests may be motivated by the proportionality test. For example, he says, some organisations requested ‘knowledge of the last retail location the data-subject visited or information about the account creation date.’ He accepted that this sort of information was beyond the scope of the remit he devised.
In conclusion
Mr Pavur found very large companies, especially tech companies, tended to have robust processes in place, while smaller companies tended to ignore the request completely.
He perceives the biggest weakness is with mid-sized businesses who don’t have the specialised processes. I would add, nor perhaps adequate resource to handle requests.
To mitigate the risks, Mr Pavur believes companies should require account logins (where available), or should outsource the verification process if they can’t adequately handle it themselves. He also suggests it should be easier for organisations to refuse a request they feel is suspicious. Even then, organisations would still need to ensure there was clear justification for their suspicion.
Mr Pavur goes further, recommending that European legislators take steps to reassure companies that they can reject requests in good faith. He also calls for clarification on appropriate forms of identity verification – in theory this may prove easier than in practice, especially given the global scope of GDPR and the different types of proof of identity individuals in different countries might have.
Robert Bond, partner at Bristows LLP comments, ‘Mr Pavur’s research highlights the need for businesses to improve their DSAR protocols, but also shows that for those organisations that handed over details relating to his girlfriend they had no process for identifying that he, Mr Pavur, was not entitled to her personal data in the first place. In sharing her personal data, they were in breach of a number of aspects of the Data Protection Act 2018, including data protection by default and the breach of her data protection rights.’
Another issue not raised by Mr Pavur concerns providing organisations with even more information than they already hold, in the name of verification. I recently received an email notifying me that a company I bought a product from (more than four years ago) had suffered a data breach. If I were to submit a DSAR, I would be very reluctant to provide a company that had already breached my data with any more personal details.
The risk is also evident that an unscrupulous social engineer might seek multiple pieces of information that seem, on their own, insignificant. Pieced together, however, those tiny nuggets of information could be used to build a bigger picture. Threat actors from fraudsters to stalkers to corporate intelligence gatherers would be obvious beneficiaries of such ‘blagging’. We saw during the phone-hacking inquiry how unscrupulous journalists used telephone and financial data to spy on celebrities and newsworthy members of the public – could the DSAR arrangements ironically provide a new opportunity?
Mr Pavur’s work is perhaps a wake-up call to those new to handling an increased level of DSARs, but it poses more questions than answers for businesses and regulators alike. Does a standardised approach to ID verification pose a threat or an opportunity? Is more regulation and resource allocation the answer, and does the ICO need to reconsider its DSAR guidance? We will be watching and reporting on any changes among key stakeholders and opinion-formers in our industry.
By Philippa Donn, August 2019
Copyright DPN
The information provided and the opinions expressed in this document represent the views of the Data Protection Network. They do not constitute legal advice and cannot be construed as offering comprehensive guidance on the EU General Data Protection Regulation (GDPR) or other statutory measures referred to.