Police force’s use of facial recognition software determined lawful – R (Bridges) v Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)

“The algorithms of the law must keep pace with new and emerging technologies.” – at [1]

The Facts

The Administrative Court, with Haddon-Cave LJ and Swift J sitting, has heard the first case on the lawfulness of police use of automated facial recognition software (“AFR”). The case concerned South Wales Police’s (“SWP”) use of AFR on two occasions on which it allegedly recorded the Claimant’s image: once on 21 December 2017 at Queen Street, Cardiff, and again at the Defence Procurement, Research, Technology and Exportability Exhibition (“the Defence Exhibition”) on 27 March 2018.

It therefore fell to be considered, by way of judicial review, whether the use of AFR in these instances was lawful. The Claimant challenged the use of AFR as being contrary to the law on three grounds:

1. Article 8 of the European Convention on Human Rights (“the Convention”)

That the use of AFR interfered with the Claimant’s Article 8(1) rights and was neither “in accordance with the law”, “necessary” nor “proportionate” under Article 8(2).

2. Data Protection

That the use was contrary to s4(4) Data Protection Act (“DPA”) 1998 and s35 DPA 2018. Further, that the use of AFR falls within s64(1) DPA 2018, and that a data protection impact assessment therefore had to be carried out.

3. Public-sector equality duty claims

Under s149(1) Equality Act 2010, that the SWP failed to take account of the fact that the use of AFR would result in a disproportionately higher rate of false-positive matches for women and minority ethnic groups, and that the use of the program would therefore indirectly discriminate. Accordingly, the SWP had failed to have regard to the relevant considerations in s149(1)(a)-(c) of the Act.

How AFR works

In order to apply the law to AFR, the Court set out the processes undertaken by the technology: compiling a database of existing images, facial image acquisition, face detection, feature extraction, face comparison and, finally, matching.

The Court also considered the fact that a specific type of software, AFR Locate, was used in both instances. Locate takes digital images of the faces of members of the public from live CCTV feeds and processes them in real time to extract the biometric contours needed. The system then compares those contours against a watchlist compiled for the specific deployment.

Watchlists are formed from images retained on the SWP’s database and comprise people ranging from those wanted on warrant or suspected of committing an offence through to vulnerable persons. When the AFR system highlights a possible match, it is communicated to an officer who reviews it to ensure intervention is justified. The match is then communicated to intervention officers, who use a traffic light system to address it.
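The judgment describes this pipeline only at a functional level. As a rough illustration, the deployment loop might look something like the minimal Python sketch below; the function names, the 128-dimension template and the 0.80 threshold are assumptions for illustration, not details of SWP’s actual system.

```python
import random
from dataclasses import dataclass

# Illustrative threshold only; a real deployment would tune this value.
MATCH_THRESHOLD = 0.80

@dataclass
class WatchlistEntry:
    name: str
    category: str          # e.g. "wanted on warrant", "vulnerable person"
    template: list[float]  # biometric template derived from a stored image

def extract_template(face) -> list[float]:
    """Stand-in for the proprietary feature-extraction step that reduces
    a detected face to a comparable biometric template."""
    rng = random.Random(hash(face))
    return [rng.random() for _ in range(128)]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Compare two templates; an illustrative metric, not SWP's own."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return dot / norm if norm else 0.0

def process_frame(faces, watchlist):
    """One pass of the Locate loop: every detected face is templated
    and compared against the deployment's watchlist."""
    for face in faces:
        template = extract_template(face)
        for entry in watchlist:
            if cosine_similarity(template, entry.template) >= MATCH_THRESHOLD:
                # No automatic intervention: the possible match goes to an
                # officer for review (the "traffic light" stage) first.
                print(f"Possible match: {entry.name} ({entry.category})")
        # Templates of unmatched faces are discarded immediately.
```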

AFR Locate processes large amounts of data:

  1. facial images;
  2. facial features (i.e. biometric data);
  3. metadata, including time and location; and
  4. information as to matches with persons on a watchlist.

AFR retains match alerts for 24 hours, match reports and the CCTV live feeds for 31 days, and immediately deletes all other elements of data.
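Expressed as a simple retention schedule, the periods described above might be modelled as follows (a sketch only; the element names are mine, and a zero period stands for immediate deletion):

```python
from datetime import timedelta

# Retention periods for each element of data, per the judgment.
RETENTION_SCHEDULE = {
    "match_alert":        timedelta(hours=24),
    "match_report":       timedelta(days=31),
    "cctv_live_feed":     timedelta(days=31),
    "facial_image":       timedelta(0),  # deleted immediately if unmatched
    "biometric_template": timedelta(0),  # deleted immediately after comparison
}

def must_delete(element: str, age: timedelta) -> bool:
    """True once a stored element has outlived its retention period."""
    return age >= RETENTION_SCHEDULE[element]
```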

The Convention rights claim

This part of the case concerns the right to a private and family life enshrined in Article 8 ECHR:

“1. Everyone has the right to respect for his private and family life, his home and his correspondence.

  2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”

Article 8(1) interference

The Court noted the propensity of AFR to infringe privacy rights on the grounds that it automatically, and without consent, collects the biometric data of a wide range of members of the public and then processes that data. This brings the functions of AFR within the broad reach of Article 8(1), which encompasses a right to a person’s image (S v. United Kingdom (2009) 48 EHRR 50, at [66]; Von Hannover v. Germany 40 EHRR 1, at [50]).

Even though the simple taking of an image has been held not to interfere with Article 8(1), the mere storing of data relating to the private life of an individual may do so.

In referencing S v. United Kingdom, the Court noted the multiple similarities between the principles underpinning the cases, S having concerned the lawfulness of the police retaining biometric data in the form of fingerprint and DNA samples.

Further, the instantaneous nature of the processing and the short period for which the data is stored did not have a bearing on this: the processing of biometric data in and of itself was sufficient to interfere with Article 8(1) rights. The Surveillance Camera Commissioner’s AFR Guidance was cited with approval for the proposition that the “potential for intrusion arising from AFR is arguably consistent with that arising from some forms of covert surveillance tactics and capabilities”. Further, even those whose images were stored on a watchlist could be considered to have their Article 8(1) rights infringed.

Article 8(2) compliance

Was use of AFR in accordance with the law?

In considering whether the SWP’s use of AFR is in accordance with the law, the Court first highlighted that there is no explicit legislative basis for its use. Instead, the use of AFR relies upon well-established common law principles.

In particular, the Court considered the taking of photographs by police officers, deemed lawful in R (Wood) v Commissioner of Police of the Metropolis [2010] 1 WLR 123 and R (Catt) v Association of Chief Police Officers [2015] AC 1065. The central premise for the lawful taking of such photographs is the maintenance of public order and the prevention and detection of crime.

It is this central purpose which underpins the justification for the lawfulness of AFR in this instance. Applying Hellewell v Chief Constable of Derbyshire [1995] 1 WLR 804 at 810F, the Court considered that such use must be reasonable.

The Court noted that legislative mandates are required for interferences with the person, such as obtaining DNA swabs and taking fingerprints, which would otherwise constitute assault. In the Court’s opinion the use of AFR is less invasive in this sense, so that reliance upon common law powers is permissible. The Court therefore equated the use of AFR to the use of CCTV.

The Court accordingly sought to place the use of AFR within the existing legal framework governing police use of similar techniques:

“The fact that a technology is new does not mean that it is outside the scope of existing regulation, or that it is always necessary to create a bespoke legal framework for it. The legal framework within which AFR Locate operates comprises three elements or layers (in addition to the common law), namely: (a) primary legislation; (b) secondary legislative instruments in the form of codes of practice issued under primary legislation; and (c) SWP’s own local policies.” – at [84]

The Court therefore considered, in turn, ss35-42 DPA 2018, the Surveillance Camera Code of Practice, and SWP’s Standard Operating Procedure, Deployment Reports and Policy on Sensitive Processing. Cumulatively these provided a legal framework underpinning the use of AFR, ensuring that regulatory guidelines were in place to sufficiently limit the interference with Article 8(1) rights.

Could the interference with Article 8(1) rights be justified?

The Court applied the four-stage proportionality test from the seminal case of Bank Mellat v Her Majesty’s Treasury (No 2) [2014] AC 700, namely:

  1. whether the objective of the measure pursued is sufficiently important to justify the limitation of a fundamental right;
  2. whether it is rationally connected to the objective;
  3. whether a less intrusive measure could have been used without unacceptably compromising the objective; and
  4. whether, having regard to these matters and to the severity of the consequences, a fair balance has been struck between the rights of the individual and the interests of the community.

The police’s use of technology in the prevention and detection of crime is well documented as satisfying the first two criteria. The Court applied strict scrutiny to adherence with the third and fourth.

These were considered to be made out on the facts: AFR was used for a limited time and a specific purpose, covered a limited area, and led to the detection of criminality. In the instance of the Defence Exhibition, disruption had occurred at the previous year’s event and an individual who had made a bomb threat was detected by the system. The Court weighed the safety of the public, the lack of impact on the Claimant (who was not on a watchlist), the targeted nature of the watchlist and the success of AFR (its use in 50 deployments had resulted in some 37 arrests or disposals).

The Data Protection Claims

The Court considered the instances of AFR use as if the DPA 2018 had been in force, despite the events themselves occurring before it took effect. This widens the scope of the judgment to cover the now-governing legislation. The Court considered three grounds:

  1. the claim under the DPA 1998;
  2. the claim under section 35 of the DPA 2018; and
  3. the claim under section 64 of the DPA 2018.

The DPA 1998

Section 4(4) DPA 1998 is at issue here: it requires adherence to the data protection principles in relation to all personal data collected. The first data protection principle was engaged:

“personal data shall be processed fairly and lawfully and, in particular, shall not be processed unless –

(a) at least one of the conditions in Schedule 2 is met, and

(b) in the case of sensitive personal data, at least one of the conditions in Schedule 3 is also met.”

The extent to which personal data was processed was at issue. The Court addressed whether AFR indirectly identifies individuals, or individuates them. It adopted the wide definition of indirect identification advocated in Breyer v Bundesrepublik Deutschland (Case C-582/14). As to individuation, the Court applied the definition endorsed in Vidal-Hall v Google Inc. [2016] QB 1003 in relation to browser-generated data: that the process singles out and distinguishes an individual from others.

The Court, unsurprisingly, determined that AFR individuates people. The facial contours it creates identify an individual, distinguishing them from others and enabling almost immediate identification.

Despite the processing of personal data individuating people, the Court had already determined AFR’s processes to be lawful and in accordance with the law in relation to the Article 8 issues; how then could it determine that they were not lawful here? AFR was used for the purpose of detecting and preventing crime and for specified, limited reasons, ensuring its processing adhered to the first data protection principle of processing data lawfully and fairly.

Section 35 DPA 2018

The Claimant contended that AFR undertakes sensitive processing of individuals’ biometric data and, in doing so, identifies them. The Respondent contended that the individuals identified should be limited to those on the watchlist.

The Court agreed with the Claimant on this point: each individual must have their biometric data processed, and be identified, in order to determine whether or not they match someone on the watchlist. AFR takes a digital image and applies a mathematical algorithm to it to produce a comparable biometric template. This brought the processing within s35(8)(b) of the DPA 2018 as the processing of biometric data for the purpose of uniquely identifying an individual.
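In code terms, the Court’s reasoning can be illustrated by extending the earlier sketch: every member of the public caught on the feed is templated and compared, so the sensitive processing happens regardless of whether a given face ultimately matches anyone on the watchlist. (The helper functions here are the illustrative ones defined in the earlier sketch, not the real system.)

```python
def count_sensitive_processing(faces, watchlist) -> int:
    """Count how many people undergo biometric processing in one frame."""
    processed = 0
    for face in faces:
        template = extract_template(face)   # sensitive processing occurs here
        _ = [cosine_similarity(template, e.template) for e in watchlist]
        processed += 1                      # counted whether matched or not
    return processed
```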

Accordingly, use of AFR had to comply with the three requirements in s35(5):

“(5) The second case is where —

(a) the processing is strictly necessary for the law enforcement purpose,

(b) the processing meets at least one of the conditions in Schedule 8 (necessity), and

(c) at the time when the processing is carried out, the controller has an appropriate policy document in place (see section 42).”

Conditions (a) and (b) had already been made out in relation to the Article 8 claim. In relation to condition (c), the SWP had formulated a Policy on Sensitive Processing for Law Enforcement Purposes. On the adequacy of this document the Court declined to be prescriptive, leaving it to the Information Commissioner to provide guidance as to the content of such documents.

Section 64 DPA 2018

This is the requirement that a data protection impact assessment be undertaken. The SWP had produced an assessment, which the Court subjected to reasonable inquiry in line with the public-sector precedent of R (Unison) v Lord Chancellor [2016] ICR 1. The Court considered that the assessment set out a clear narrative, specifically considering Article 8 rights whilst establishing safeguards.

The public-sector equality duty claim

This claim relies upon s149(1) of the Equality Act 2010, which requires public authorities to put in place systems to eliminate discrimination, advance equality of opportunity and foster good relations between those who have protected characteristics and those who do not.

In adherence to these requirements, the SWP had prepared an Equality Impact Assessment – Initial Assessment, showing it had considered its obligations at an early stage. The Claimant critiqued this on the basis that it failed to consider that the AFR software might produce indirectly discriminatory results, relying on results indicating that such software is more likely to falsely match female and minority ethnic faces.

The Court noted that there was no firm evidence that AFR produces indirectly discriminatory results. This was a less tangible argument, as it relied upon expert evidence that the algorithms and datasets underpinning AFR tend to create these risk factors. The Court stressed the intangibility of the evidence and put great stock in the safeguard of having an officer make their own determination of any match the system provides. Accordingly, the Claimant’s arguments on the equality ground failed.

Therefore, the Claimant’s claim for judicial review was dismissed on all grounds.


Comment

Given previous caselaw on the use of police powers, this judgment is perhaps unsurprising. The oft-cited S case provides a framework for the decision in Bridges, since it too considered advancements in technology, in the form of fingerprinting. The primary elements of the case reside in the privacy and data-oriented arguments, which do much to reflect the Court’s incremental approach to new technologies.

What is perhaps surprising is that AFR can be integrated into law enforcement activities without the establishment of separate regulatory guidelines. This is instructive of how flexible the common law provisions are, with interference with the person as their hard stop. It is perhaps less surprising when one notes that the DPA 2018 was drafted with such developments in mind. Nonetheless, the ascent to what some may consider “Big Brother” styles of surveillance came rather easily in light of pre-existing frameworks and guidelines.

The advanced and invasive nature of AFR in harvesting and generating biometric data was acknowledged by the Court, as was its technological complexity. Many different angles nonetheless remain to be explored. What would a complaint from a “vulnerable person” who finds themselves on a watchlist do to the arguments advanced in Bridges? A nuanced consideration weighing the prevention and detection of crime against Article 8 rights would be welcome.

The judgment applies a broad brush to the Claimant’s point about the discriminatory tendencies perpetuated by the system. This is likely due to the startling lack of evidence used to substantiate the concerns. Trial groups and instructive datasets, as well as previous dummy runs, would have been necessary to show discriminatory trends in the data. For much to be made of this intransigent point, evidence-based methods of data collection must become the norm so that any trends can be sufficiently identified. Even then, the development and continued improvement of the system are likely to render such concerns short-lived at best, though their importance makes them all the more pressing in the meantime.
