EU-U.S. Data Privacy Framework – how long will it last?

What does this mean and are legal challenges expected?

The European Commission has adopted its adequacy decision for the EU-U.S. Data Privacy Framework (DPF). The EC has confirmed the DPF provides protection for personal data transferred to the U.S. which is comparable to the protection provided within the EU.

The new framework enters into force immediately, as of 11th July 2023. This decision provides a new lawful means for data transfers from exporters based in the EU to the U.S.

It works in a similar way to the previous Privacy Shield, and will only apply where US organisations certify compliance with the DPF’s principles.

It’s proposed the UK-US ‘Data Bridge’ will shortly piggyback off this EU-US agreement.

U.S. says commitments have been met

Granting this adequacy decision required significant changes to U.S. intelligence-gathering activities. The EC's decision came a few days after the U.S. announced it had completed its key commitments under President Biden's Executive Order relating to the DPF. A press release published by the European Commission confirmed:

“The EU-U.S. Data Privacy Framework introduces new binding safeguards to address all the concerns raised by the European Court of Justice, including limiting access to EU data by U.S. intelligence services to what is necessary and proportionate and establishing a Data Protection Review Court.”

Robert Bond, Senior Counsel at Privacy Partnerships and Chair of the DPN Advisory Group, commented:

“The new framework introduces significant improvements compared to the mechanism that existed under the Privacy Shield. The safeguards put in place by the US will also facilitate transatlantic data flows more generally, since they apply when data is transferred by using other tools, such as SCCs and BCRs and as the DPF is an adequacy decision by the EU in respect of the data privacy regime in the US, this may simplify the EU transfer impact assessment requirements.”

Self-certification

Crucially, US-based data importers must certify their compliance with the DPF principles, which are an updated version of the previous Privacy Shield principles. Organisations which were certified under the Privacy Shield are likely to be in a good position to self-certify under the DPF.

To join the DPF, an eligible organisation must develop a privacy policy which conforms to expected standards, identify an independent recourse mechanism and self-certify through the U.S. Department of Commerce’s DPF website.

EU-based data exporters will be able to check a list on the DPF website to see if a US organisation is certified or not.

Legal challenge is on its way

Both of the past EU-U.S. data transfer frameworks, Safe Harbor and Privacy Shield, were ruled invalid by the Court of Justice of the European Union (CJEU). Concerns are therefore likely to remain about the longevity of the DPF.

noyb, headed up by the well-known Austrian privacy activist Max Schrems, has already stated its view that the ‘New Trans-Atlantic Data Privacy Framework is largely a copy of Privacy Shield’ and confirmed it plans to challenge the EC’s decision. So watch this space!

EU AI Act – Quick Factsheet

February 2024

Use of artificial intelligence in the EU to be regulated

In early February, European Union member countries unanimously gave the green light to a new Artificial Intelligence Act, following lengthy negotiations and overcoming fears it would stifle European innovation.

The EU AI Act now needs to be signed off by EU lawmakers. It’s anticipated it will come into force later this year, with implementation phased over the following 36 months.

The Act aims to ban unacceptable use of artificial intelligence and introduce specific rules for AI systems proportionate to the risk they pose. It will impose extensive requirements on those developing and deploying high-risk AI systems.

It’s likely the Act won’t just govern AI systems operating in the EU, with its scope extending to foreign entities which place AI systems on the market or put them into service in the EU.

The definition of AI systems in the Act is one proposed by the OECD: “An AI system is a machine-based system that infers from the input it receives how to generate outputs such as predictions, content, recommendations, or decisions that can affect physical or virtual environments.”

Quick AI Act Factsheet

1. Banned applications

Certain uses of AI which threaten democracy and people’s rights will be prohibited. These include, but are not limited to: biometric categorisation systems which use special category data; real-time and remote biometric identification systems (such as facial recognition); and emotion recognition in the workplace and in educational institutions.

2. Law enforcement and national security exemptions

There will be a series of safeguards and narrow exemptions allowing for the use of biometric identification systems in publicly accessible spaces for law enforcement purposes. The legislation will not apply to systems which are exclusively used for defence or military applications.

3. Tiered risk-based approach

The requirements organisations will need to meet will be tiered depending on the risk. For example:

  • For AI systems classified as high-risk there will be core requirements, such as mandatory fundamental rights impact assessments, registration on a public EU database, data governance, transparency, human oversight and more.
  • General-purpose AI (GPAI) systems, and the GPAI models they are based on, will need to adhere to transparency requirements, including maintaining technical documentation, complying with EU copyright law and publishing detailed summaries of the content used to train the models.
  • For generative AI applications, people will have to be informed when they are interacting with AI, for example a chatbot.

4. Right to complain

People will have the right to launch complaints about AI systems and receive explanations about decisions based on high-risk AI systems which impact their rights.

5. Higher fines than GDPR

Non-compliance with the rules could lead to fines of up to 35 million Euros or 7% of global annual turnover. This is a notable hike from the GDPR, which sets a maximum of 20 million Euros or 4% of annual worldwide turnover.

 

The EU AI Act represents the world’s first comprehensive legislative framework for regulating AI. It’s anticipated it will become a global standard, much as the GDPR has for data protection.

What’s clear is organisations need to take steps now to raise awareness and upskill employees, for example in compliance, legal, data protection and security teams, and (by no means least) product development.

Decisions should be made about who needs a greater understanding of AI, how it will be regulated and where responsibilities for AI governance rest within the organisation.

As for the UK, some are calling on the Government to include AI in the Data Protection and Digital Information Bill. Conversely, others are warning against hastily-made regulation in this area.

Managing data transfers from the UK

February 2022

The new International Data Transfer Agreement (IDTA) and Addendum are a sensible evolution of the old SCCs

International Data Transfers – to recap

Whenever UK-based organisations transfer personal data to a third country outside the UK, they need to make sure the transfers are lawful, by confirming the security of the data and the rights of individuals remain protected when the data leaves the country.

Since the famous “Schrems II” ruling by the European Court of Justice in 2020, this activity has been thrown into disarray. To remind you, this is the ruling which invalidated the EU-US Privacy Shield and raised concerns about the use of EU Standard Contractual Clauses (SCCs) to protect the data. 

Soon after, the European Commission set to work updating the EU SCCs. These were drafted and enacted fairly swiftly, taking effect on 27th June 2021.

What are the new EU SCCs?

The new EU SCCs were expanded to introduce more flexible scenarios: 

  • The SCCs are now modular, meaning they can accommodate different scenarios: you pick the parts which relate to your particular situation.
  • The SCCs cover four different transfer scenarios, including those involving processors:
    • Controller to controller
    • Controller to processor
    • Processor to controller
    • Processor to processor
  • More than two parties can accede to the SCCs, meaning additional controllers and processors can be added throughout the lifetime of the contract. This potentially reduces the administrative burden.

How did this affect the UK? 

On 28th June 2021, the UK’s adequacy decision was adopted. From 27th September 2021, the prior version of the EU SCCs could no longer be used for new contracts.

In our webinar last year, it was obvious that everyone was confused. The disruption caused by the “Schrems II” ruling was compounded by the fact that Brexit had been completed. This meant the new SCCs approved in Europe did not automatically apply to transfers from the UK. The UK needed its own SCCs, but they did not exist.

The ICO consultation

From August to October 2021, the ICO conducted a consultation to understand how a UK version of these rules should be enacted. Since the UK had been granted an adequacy decision by the EU, we all hoped it would be possible to mirror the SCC arrangements in UK law, thus reinstating the means by which data can lawfully be exported to places such as the US.

Anecdotally, the resounding view was not to mess with the principles enshrined in the EU SCCs, as doing so would simply add complexity to an already complex situation.

The ICO conclusion

In January, the ICO published the International Data Transfer Agreement (IDTA) and the International Data Transfer Addendum to the EU Commission Standard Contractual Clauses. To the layperson, this means the EU’s standards have essentially been adopted.

What’s included in the Agreement and Addendum? 

    1. The International Data Transfer Agreement (IDTA) replaces the old EU SCCs, which were relied upon to provide the appropriate safeguards required under the UK GDPR for international data transfers from the UK. There are differences from the new EU SCCs: it is a single, all-encompassing agreement that incorporates all the scenarios identified in the EU SCCs. Sections can be omitted and there is no requirement for it to be signed. It is most useful for those creating new data transfer agreements.
    2. The UK Addendum is a far simpler document. It is an addendum to the EU SCCs in which references to EU laws are replaced by references to UK laws. It allows businesses to use the EU SCCs for international data transfers not only from the EU but also from the UK. It is useful for those already using the EU SCCs who want a simple addendum to update the legal context.

When does this come into force?

The IDTA was laid before Parliament on 2nd February 2022. It comes into force on 21st March 2022 if there are no objections. To all intents and purposes, it’s in force now. The Information Commissioner’s Office (ICO) has stated the IDTA and UK Addendum:

“are immediately of use to organisations transferring personal data outside of the UK, subject to the caveat that they come into force on 21 March 2022 and are awaiting Parliamentary approval”.

What does this all mean?

In practice, UK businesses can breathe a sigh of relief and get on with their lives. There is clarity at last. Existing agreements need to be updated with the UK Addendum and new ones can be put in place with the International Data Transfer Agreement. There will be an administrative burden, but businesses now know what they need to do.  Good sense has prevailed.