EU AI Act – Quick Factsheet

February 2024

Use of artificial intelligence in the EU to be regulated

In early February, European Union member states unanimously gave the green light to a new Artificial Intelligence Act, following lengthy negotiations and overcoming fears that it would stifle European innovation.

The EU AI Act now needs to be signed off by EU lawmakers. It’s anticipated it will come into force later this year, with a 36-month implementation period.

The Act aims to ban unacceptable use of artificial intelligence and introduce specific rules for AI systems proportionate to the risk they pose. It will impose extensive requirements on those developing and deploying high-risk AI systems.

It’s likely the Act won’t just govern AI systems operating in the EU, with its scope extending to foreign entities which place AI systems on the market or put them into service in the EU.

The definition of AI systems in the Act is one proposed by the OECD: “An AI system is a machine-based system that infers from the input it receives how to generate outputs such as predictions, content, recommendations, or decisions that can affect physical or virtual environments.”

Quick AI Act Factsheet

1. Banned applications

Certain uses of AI which threaten democracy and people’s rights will be prohibited. These include, but are not limited to: biometric categorisation systems which use special category data; real-time and remote biometric identification systems (such as facial recognition); and emotion recognition in the workplace and educational institutions.

2. Law enforcement and national security exemptions

There will be a series of safeguards and narrow exemptions allowing for the use of biometric identification systems in publicly accessible spaces for law enforcement purposes. The legislation will not apply to systems which are exclusively used for defence or military applications.

3. Tiered risk-based approach

The requirements organisations will need to meet will be tiered according to the risk posed. For example:

  • For AI systems classified as high-risk, there will be core requirements such as mandatory fundamental rights impact assessments, registration on a public EU database, data governance, transparency, human oversight and more.
  • General-purpose AI (GPAI) systems, and the GPAI models they are based on, will need to adhere to transparency requirements, including having technical documentation, complying with EU copyright law and providing detailed summaries of the content used to train systems.
  • For generative AI applications, people will have to be informed when they are interacting with AI, for example a chatbot.

4. Right to complain

People will have the right to launch complaints about AI systems and receive explanations about decisions based on high-risk AI systems which impact their rights.

5. Higher fines than GDPR

Non-compliance with the rules could lead to fines of up to €35 million or 7% of global annual turnover. This is a notable hike from the GDPR, which sets a maximum of 4% of annual worldwide turnover.

 

The EU AI Act represents the world’s first comprehensive legislative framework for regulating AI. It’s anticipated it will become a global standard, as the GDPR has for data protection.

What’s clear is that organisations need to take steps now to raise awareness and upskill employees; for example, in compliance, legal, data protection, security and (by no means least) product development teams.

Decisions should be made about who needs a greater understanding of AI, how it will be regulated and where responsibilities for AI governance rest within the organisation.

As for the UK, some are calling on the Government to include AI in the Data Protection and Digital Information Bill. Conversely, others are warning against hastily-made regulation in this area.

Google Analytics: GA4 vs Universal Analytics – What will change?

July 2022

Will GA4 improve compliance?

If you use Google Analytics, you will have started to see messaging warning that the Universal Analytics tools will be retired in 2023 and that now is the time to migrate across to Google Analytics 4.

What is Google Analytics 4 (GA4)?

GA4 is a new property type that helps analyse the performance of your website and app traffic, and will replace Universal Analytics. It was first released in October 2020, although it’s only now that the campaign to migrate across has started in earnest.

 Key components include: 

  • Event-based tracking: Universal Analytics is session-based, while GA4 is event-based. In other words, the ability to track events like button clicks, video plays and more is built into GA4, while this requires advanced setups in UA. This comes from the premise that page views aren’t the sole important metric.
  • Cross-device tracking: UA was built around desktop web traffic, while GA4 gives businesses visibility into customer journeys across all of their websites and apps.
  • Machine learning: GA4 uses machine learning technology to share insights and make predictions.
  • Privacy-friendly: UA relies heavily on cookies; GA4 does not.

Crucially, on July 1, 2023, standard Universal Analytics properties (the previous version of Google Analytics) will no longer process data. You’ll be able to see your Universal Analytics reports for a period of time after July 1, 2023. This means that, to maintain a continuous history of activity, it makes sense to move across to the new GA4 platform sooner rather than later.

What privacy improvements have been made?

GA4 came with a set of new privacy-focused features aimed at ticking GDPR boxes, including:

  • Data deletion mechanism. Users can now request to surgically extract certain data from the Analytics servers via a new interface.
  • Shorter data retention period. You can now shorten the default retention period to 2 months (instead of 14 months) or add a custom limit.
  • IP anonymisation. GA4 doesn’t log or store IP addresses by default; it allocates an anonymous, unique user ID to each record.
  • First-party cookies. Google uses first-party cookies, which means they’ll still be supported by browsers.
  • More data sampling. Google is doing more data sampling using AI to gain granular analytics insights; this is more privacy-friendly and uses models to investigate deeper insights.
  • Consent mode. The behaviour of Google tags is managed based on user consent choices.
  • Collecting PII. Google does not allow the collection of PII in GA4; this is considered a violation of Google’s terms of service.
  • Data sharing with other Google products. Any linking to Google advertising products requires explicit opt-in consent and a prominent section in the privacy notice.

Is Google now compliant?

Possibly, in limited circumstances. If Google anonymises the data by allocating a user ID that is never cross-referenced with any other data, then we can argue the data is anonymous and therefore not subject to the GDPR.

In some instances, this may be the case if you are doing simple tracking and effectively treat your digital platforms as an ivory tower. In most instances, it is not!

If you are advertising and can then link the id to other data, there is the potential to identify individuals and therefore the information becomes personal data and subject to GDPR.

This means that all the usual user consent rules apply and opt-in consent is required to analyse activity.

The major difficulty for Google is that data is exported to the US where it is deemed, by the EU, that Google does not adequately protect EU personal data from US surveillance rules. 

Previously, Google relied on the Privacy Shield framework to remain compliant. Since that framework was invalidated in 2020, Google has struggled to achieve compliance and has faced a number of fines.

In particular, Google Analytics does not provide a way of:

  • Ensuring data storage within the EU
  • Choosing a preferred regional storage site
  • Notifying users of the location of their data storage and any data transfers outside of the EU

What next?

Ideally, Privacy Shield 2.0 will be introduced soon! Talks have started, but they’re unlikely to be swift. The US government has been talking about making its surveillance standards “proportional” to those in place in the EU. This may not be good enough for the CJEU.

In the meantime, implement GA4, as it is more privacy-focused than Google Universal Analytics, and hope that the US and EU come to an agreement soon. There is still a risk in using GA4, and you might want to consider other solutions.

Managing data transfers from the UK

February 2022

The new International Data Transfer Agreement (IDTA) and Addendum is a sensible evolution of the old SCCs

International Data Transfers – to recap

Whenever UK-based organisations arrange the transfer of personal data to a third country outside the UK, they need to make sure the transfers are lawful by confirming that data security and the rights of individuals remain protected when data leaves the country.

Since the famous “Schrems II” ruling by the European Court of Justice in 2020, this activity has been thrown into disarray. To remind you, this is the ruling which invalidated the EU-US Privacy Shield and raised concerns about the use of EU Standard Contractual Clauses (SCCs) to protect the data. 

Soon after, the European Commission set to work updating the EU SCCs. These were drafted and enacted fairly swiftly, taking effect on 27th June 2021.

What are the new EU SCCs?

The new EU SCCs were expanded to introduce more flexible scenarios: 

  • SCCs are now modular meaning that they can accommodate different scenarios, where you can pick the parts which relate to your particular situation.
  • The SCCs cover four different transfer scenarios:
    • Controller to controller
    • Controller to processor
    • Processor to controller
    • Processor to processor
  • More than two parties can accede to the SCCs, meaning additional controllers and processors can be added through the lifetime of the contract. This potentially reduces the administrative burden.

How did this affect the UK? 

On 28th June 2021, the UK’s adequacy decision was adopted. On 27th September 2021, the prior version of the SCCs could no longer be used for new contracts.

In our webinar last year, it was obvious that everyone was confused. The situation caused by the “Schrems” ruling was compounded by the fact that Brexit had been completed. This meant we could no longer apply the SCCs approved in Europe. The UK needed its own SCCs, but they did not exist. 

The ICO consultation

From August to October 2021, the ICO conducted a consultation to understand how a UK version of these rules should be enacted. Since we had been granted an adequacy agreement by the EU, we all hoped it would be possible to mirror the SCC arrangements in UK law, thus reinstating the means by which we could lawfully export data to places such as the US.

Anecdotally the resounding view was not to mess with the principles enshrined in the EU SCCs as it would simply add complexity to an already complex situation.

The ICO conclusion

In January, the ICO published the International Data Transfer Agreement (IDTA) and the International Data Transfer Addendum to the EU Commission Standard Contractual Clauses. To the layperson, the EU’s standards have been adopted. 

What’s included in the Agreement and Addendum? 

    1. The International Data Transfer Agreement (IDTA) replaces the old EU SCCs, which were relied upon to provide the appropriate safeguards required under the UK GDPR for international data transfers from the UK. There are differences from the new EU SCCs: it is a single, all-encompassing agreement that incorporates all the scenarios identified in the EU SCCs. Sections can be omitted, and there is no requirement for it to be signed. This is most useful for those creating new data transfer agreements.
    2. The UK Addendum is a far simpler document. It is an addendum to the EU SCCs in which references to EU laws are replaced by references to UK laws. It allows businesses to use the EU SCCs for international data transfers not only from the EU but also from the UK. This is useful for those already using the EU SCCs who want a simple addendum to update the legal context.

When does this come into force?

The IDTA was laid before Parliament on 2nd February 2022 and comes into force on 21st March if there are no objections. To all intents and purposes, it’s in force now. The Information Commissioner’s Office (ICO) has stated the IDTA and UK Addendum:

“are immediately of use to organisations transferring personal data outside of the UK, subject to the caveat that they come into force on 21 March 2022 and are awaiting Parliamentary approval”.

What does this all mean?

In practice, UK businesses can breathe a sigh of relief and get on with their lives. There is clarity at last. Existing agreements need to be updated with the UK Addendum and new ones can be put in place with the International Data Transfer Agreement. There will be an administrative burden, but businesses now know what they need to do.  Good sense has prevailed. 

 

You’ve been SAR-bombed!

July 2020

You are at the end of a long day, just about to turn in for the night. You do one last check of your inbox for any signs of a reported security incident. Suddenly you are aghast: the new email count in your inbox registers over 9,000 new emails! You quickly scan to fathom what on earth has happened…

All the emails come from the same sender, and the subject lines all declare they are SARs (Subject Access Requests). Looking closer, you note the emails include personal information, describe that “so-and-so” wants to exercise a privacy right and reference different privacy laws.

These are laws you know require you to reasonably address privacy requests, with penalties should you fail to respond in good faith and in a timely manner.

While I hope you never experience 9,000 requests in one hit, people seem to be increasingly relying on third parties and apps to facilitate their privacy rights. Indeed, some third-party portals are actively encouraging people to use their services.

Once your organisation is identified, you are likely to receive requests from the third party’s entire user base; all delivered to the email address published via your privacy statements.

Let’s explore this trend in more detail and give you a glimpse of how to tackle the SAR-bomb experience.

The Dawn of Privacy Preference Apps

Chances are you’ve already received or honoured a privacy request delivered via a third party in some fashion or another. Country- and channel-specific regulatory “do not contact” lists have for some years allowed people to opt out of direct marketing en masse. Some third parties offer people template letters to express privacy choices with a pre-defined list of organisations that should receive them.

Mobile apps are also available to help individuals exercise their requests. One such app seeks to help individuals to identify organisations they have previously transacted with for the purposes of exercising their privacy rights and another is designed to help individuals address legal disputes.

Of course, California’s Consumer Privacy Act (CCPA) now requires organisations to process privacy requests delivered by third parties (defined as “authorized agents”). As the world’s sixth largest economy, California’s “authorized agent” mandates are likely to be replicated and to influence individuals’ expectations beyond the state.

Mindset

When addressing privacy requests delivered to you via third parties, be sure your response plan considers first the people submitting these requests. They’ve already invested some time and energy and may have even paid for the help these parties and solutions offer.

People may have turned to such third parties to assert control over their data in as broad a manner as possible. Some may be frustrated, confused or upset, and others may not be aware, or care, that your organisation has specific obligations under the law.

Your procedures to authenticate identity, validate the processing of personal data, address requests within your organisation and ensure the security of the data in your care are likely of little concern to individuals.

Even though the law may require you to separately affirm certain requests received online, some individuals simply won’t appreciate your attempts to confirm the authenticity of their requests.

Furthermore, your requests of people to follow your processes may be met with frustration, indifference and scepticism, especially when you need them to take additional action to facilitate their original request.

Your experience addressing sensitive SAR requests, such as those associated with disgruntled employees or customers punishing you for bad service, can be especially useful.

Getting to Work

With the individual’s mindset front and centre, let’s shift attention to some of the considerations specific to being SAR-bombed. Time is of the essence, and you need a systematic approach to establish whether you will deny, partially comply with, or fully comply with each request.

  • Get your arms around the situation – At a minimum, you need to identify each individual, extract the personal data (as needed to authenticate their identity and confirm the data exists within your organisation) and define the rights they wish to exercise. Conduct a quick test to see how much time is needed based on the total volume.

In our example, let’s say it takes you just 90 seconds to open one of the emails, log the relevant details to your SARs system and archive the email. At 9,000 requests, you may need 225 hours to convert these SAR emails into requests that make sense within your organisation.
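As a sanity check, the arithmetic above can be sketched in a few lines. Both constants come from the worked example and are assumptions; substitute the results of your own timing test:

```python
# Back-of-the-envelope triage estimate for a bulk SAR mailbox.
# Both constants are assumptions from the worked example above;
# replace them with your own measured handling time and volume.
SECONDS_PER_EMAIL = 90   # time to open, log and archive one email
TOTAL_EMAILS = 9_000     # size of the SAR-bomb

total_hours = TOTAL_EMAILS * SECONDS_PER_EMAIL / 3600
print(f"Estimated manual triage effort: {total_hours:.0f} hours")  # prints 225 hours
```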

  • Create a structured dataset – The volume of SARs simply requires a repeatable process designed to convert the unstructured privacy email into a structured request that makes sense within your organisation. It may help to create a solution that can parse emails for relevant details and return data back to you in a structured format.

If your email platform supports it, consider exporting all the SAR emails into a Comma Separated Values or “CSV” file. Once in a CSV file, you can use your favourite spreadsheet program to make short work of your analysis and response.
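A minimal sketch of that conversion step is below. The field names and regular expressions are assumptions about what a portal’s emails might contain, not any real portal’s format; adapt them to the messages you actually receive:

```python
import csv
import re

def parse_sar_email(subject: str, body: str) -> dict:
    """Extract the details needed to log one request in a SARs system.
    The 'Name:'/'Right:' labels and the law names are assumed formats."""
    name = re.search(r"Name:\s*(.+)", body)
    law = re.search(r"\b(GDPR|CCPA)\b", body)
    right = re.search(r"Right:\s*(.+)", body)
    return {
        "subject": subject.strip(),
        "name": name.group(1).strip() if name else "",
        "law": law.group(1) if law else "unknown",
        "right": right.group(1).strip() if right else "unspecified",
    }

def write_requests_csv(emails, path="sar_requests.csv"):
    """emails: iterable of (subject, body) pairs; writes one CSV row each."""
    rows = [parse_sar_email(subject, body) for subject, body in emails]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["subject", "name", "law", "right"])
        writer.writeheader()
        writer.writerows(rows)
    return rows
```

From there, the CSV opens directly in your spreadsheet program for the analysis described above.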

  • Include key details within your structured dataset – Consider assigning a unique identifier specific to the request and sender, to help you trace the original request across the actions needed to address it. Pull forward the personal data related to the request in a way which reflects your existing SARs authentication and matching procedures.

You may also extract demographic information across specific columns; especially useful if the requests reference rights across different jurisdictions or laws. Denote the privacy right (or rights) for each request. Be sure to use terms your organisation understands to save time.

Consider assigning a reference to the jurisdiction (or law) applicable to the request; or the individual involved. For example, it may be useful to validate GDPR requests originating from Europeans differently from CCPA requests from Californians.

  • Questions relevant to developing your strategy

a. Do you have multiple requests for the same individual? Check for duplications, i.e. the same individual requesting the same right.
b. Do you have requests that aren’t legally required? Check whether those exercising a right are indeed subject to the right or law referenced. For example, is the individual a European (if referencing GDPR) or a Californian (if referencing CCPA)? Depending on the volume and results of this analysis, you may need to address requests subject to the law first.
c. Can you act on the request as presented? Do you have evidence the third party has authority to act on the individual’s behalf? Are you able to verify their identity? If you need more information, your response plan also needs to factor in developing and sending communications, and addressing the responses.
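The duplicate and jurisdiction checks in (a) and (b) can be sketched as follows. The matching key here (name, law and right) is a deliberately naive assumption, not a legal test of identity; real matching would lean on your authentication procedures:

```python
def triage(requests):
    """requests: list of dicts with 'name', 'law' and 'right' keys."""
    seen = set()
    unique, duplicates = [], 0
    for req in requests:
        # Naive duplicate key: same person asking for the same right
        # under the same law. Real matching would be more careful.
        key = (req["name"].lower(), req["law"], req["right"].lower())
        if key in seen:
            duplicates += 1
            continue
        seen.add(key)
        unique.append(req)
    # Group the unique requests by the law they cite, so requests that
    # are actually subject to a given law can be addressed first.
    by_law = {}
    for req in unique:
        by_law.setdefault(req["law"], []).append(req)
    return unique, duplicates, by_law
```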

  • Create records to demonstrate your reasonable efforts – Regardless of your specific response plan, be sure to keep records detailing what you did and the decisions you made. This may include:

1) details of your actions to assess the request
2) communications with the individual
3) actions taken internally to address the request
4) summary of results (for example whether you denied, partially or fully complied)
5) the timeframe taken to resolve

Adopting the approach above, my company, Harte Hanks, addressed 9,254 email requests within just a few days. We identified that 96% of the requests delivered were simply duplicates.

The “sender” seems to have experienced a technical problem, delivering the same request on average at least 44 times and one over 1,600 times. Of the 326 “unique” requests delivered, 67 requests described rights under CCPA whereas the other 259 described rights under GDPR.

When considering the personal data delivered along with the requests, we found all CCPA requests included personal details reasonably descriptive of a Californian, whereas only 16 of the remaining “GDPR” requests reasonably “described” a European.

Here’s to hoping you don’t ever experience such a deluge of requests at one time.

Further information

In the UK, the Information Commissioner’s Office addresses requests made via third party portals in its detailed Right of Access Guidance.

The ICO says that, to determine whether you need to comply with such a request, you should consider whether you are able to verify the identity of the individual and are satisfied the third-party portal is acting with the authority of, and on behalf of, the individual in question.

The regulator stresses you are not obliged to take proactive steps to discover that a SAR has been made. So, if you can’t view the SAR without paying a fee or signing up to a service, you have not ‘received’ a SAR and are not obliged to respond.

Furthermore, it’s the portal’s responsibility to provide evidence that it has appropriate authority to act on the individual’s behalf. When responding to a SAR, you are not obliged to pay a fee or sign up to a third-party service. If you are in this position, the regulator’s advice is to provide the information to the individual directly. The draft code states:

“If you have concerns that the individual has not authorised the information to be uploaded to the portal or may not understand what information would be disclosed to the portal, you should contact the individual to make them aware of your concerns.”