Use of automated facial recognition by South Wales Police ruled ‘unlawful’

August 2020

The Court of Appeal has upheld a legal challenge against the use of automated facial recognition (AFR) technology by South Wales Police (SWP).

The appeal was brought by Ed Bridges from Cardiff, backed by the civil rights group Liberty.

The AFR technology in question uses cameras to scan faces within a crowd, then matches these images against a ‘Watch List’ (which can include images of suspects, missing people and persons of interest). This flags up potential matches to officers.
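For readers curious about the mechanics, the matching step can be sketched in rough terms. The example below is a minimal, hypothetical illustration only, not a description of SWP's actual system: the function names, similarity threshold and embedding values are assumptions. It shows the general idea of comparing a numerical "embedding" of a face seen on camera against stored embeddings of people on a watchlist, and flagging sufficiently similar candidates for an officer to review.

    # Hypothetical sketch of watchlist matching (not SWP's actual system).
    # Assumes faces have already been converted into fixed-length embedding
    # vectors by some face-recognition model; only comparison/flagging is shown.
    import numpy as np

    SIMILARITY_THRESHOLD = 0.8  # assumed value; real systems tune this carefully

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Cosine similarity between two embedding vectors (1.0 = same direction)."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def flag_matches(frame_embedding: np.ndarray,
                     watchlist: dict[str, np.ndarray]) -> list[tuple[str, float]]:
        """Return (name, score) pairs from the watchlist that exceed the threshold."""
        hits = []
        for name, ref_embedding in watchlist.items():
            score = cosine_similarity(frame_embedding, ref_embedding)
            if score >= SIMILARITY_THRESHOLD:
                hits.append((name, score))
        # Highest-scoring candidates first, for a human officer to review
        return sorted(hits, key=lambda h: h[1], reverse=True)

    # Toy usage with made-up embeddings
    watchlist = {"person_A": np.array([0.9, 0.1, 0.3]),
                 "person_B": np.array([0.2, 0.8, 0.5])}
    print(flag_matches(np.array([0.88, 0.12, 0.33]), watchlist))

In a deployed system the flagged candidates would be presented to officers rather than acted on automatically; the legal questions in this case concern how, where and against whom such matching may be run.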

Mr Bridges argued his human rights were breached when his biometric data was analysed without his knowledge or consent.

Liberty’s barrister, Dan Squires QC, argued there were insufficient safeguards within the current laws to protect people from arbitrary use of the technology, or to ensure its use is proportionate.

The Court upheld three of the five specific points of appeal, finding that:

  • There was no clear guidance on where AFR Locate (the technology used) could be deployed or who could be placed on a watchlist. The Court held that this left too broad a discretion to individual police officers to meet the standard required by Article 8 of the Human Rights Convention.
  • The Data Protection Impact Assessment (DPIA) carried out by South Wales Police was found to be deficient because it was written on the basis that Article 8 was not infringed.
  • SWP did not take reasonable steps to find out whether the software was biased on racial or gender grounds.

This successful appeal followed the dismissal of the case at the Divisional Court on 4 September 2019 by two senior judges, who concluded that use of AFR technology was not unlawful.

Talking about the latest verdict, Mr Bridges commented:

“I’m delighted that the court has agreed that facial recognition clearly threatens our rights. This technology is an intrusive and discriminatory mass surveillance tool.

“For three years now, South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”

SWP have confirmed that they will not seek to appeal against the Court of Appeal’s judgment.

What impact is this ruling on facial recognition likely to have?

The ruling’s impact will extend to other police forces. However, it may not prevent them from using AFR technologies in the future.

The judges commented that the benefits of AFR are “potentially great” and the intrusion into people’s privacy “minor”. Even so, more care is clearly needed over how the technology is used.

To move forward, police forces will need clearer, more detailed guidance. For example, the ruling indicates officers should document who they are looking for and what evidence they have that those targets are likely to be in the monitored area.

The Surveillance Camera Commissioner for England and Wales, Tony Porter, suggested that the Home Office should update its Code of Practice.

It will be interesting to watch how this develops. The benefits clearly need to be carefully balanced against the privacy risks.