Police’s use of facial recognition software found unlawful in Court of Appeal – R v Bridges

In R v Bridges [2020] EWCA Civ 1058 the Court of Appeal considered a challenge to the Divisional Court's findings on the five grounds set out below.

It should be noted that the appeal was allowed on grounds 1, 3 and 5.

The case concerned the lawfulness of South Wales Police using Automated Facial Recognition (“AFR”) software on two occasions. The lawfulness of the police’s actions had been upheld at Divisional Court level. The case therefore has implications for the roll-out of facial recognition software by the police on a national basis.

R v Bridges concerns South Wales Police’s trial of facial recognition software, a matter highly contested by privacy advocates.

Ground 1: The Divisional Court erred in concluding that the interference with the Appellant’s rights under Article 8(1) of the Convention, taken with section 6 of the HRA 1998, occasioned by SWP’s use of AFR on 21 December 2017 and 27 March 2018 and on an ongoing basis, was/is in accordance with the law for the purposes of Article 8(2).

This Ground was upheld as the Court found that there was no sufficient legal framework for the use of AFR. In particular, it was found that:

“It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed… too much discretion is left to individual police officers” at [91].

The Surveillance Camera Code of Practice was insufficient as neither who was on the watchlist nor where AFR might be deployed was sufficiently defined, “two critical defects in the legal framework” at [120].

South Wales Police’s policies did not provide sufficiently prescriptive requirements either, merely stating in its Data Protection Impact Assessment:

“As we are testing the technology South Wales Police have deployed in all event types ranging from high volume music and sporting events to indoor arenas.”

This was merely descriptive and very broad-ranging.

Thus the use of AFR Locate was not in accordance with the law: the first limb of the two-stage test for Article 8(2) compliance was not met.
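To make the mechanics concrete for non-technical readers, the sketch below shows, in Python, the general pattern of a watchlist-matching system of the kind AFR Locate is described as being: a face captured from a live feed is compared against pre-enrolled watchlist entries and flagged if it scores above a threshold. This is a minimal illustrative sketch under assumed details; the embedding model, similarity measure, threshold and data structures are placeholders, not features of SWP’s actual system.

```python
# Illustrative sketch only: a generic watchlist-matching loop of the kind an
# AFR Locate-style deployment is described as performing. The embedding model,
# similarity threshold and data structures are assumptions for illustration,
# not details of SWP's actual system.
from dataclasses import dataclass
from typing import List, Optional

import numpy as np


@dataclass
class WatchlistEntry:
    person_id: str
    embedding: np.ndarray  # pre-computed face embedding for a person of interest


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: List[WatchlistEntry],
                            threshold: float = 0.6) -> Optional[WatchlistEntry]:
    """Return the best watchlist match scoring at or above the threshold, else None.

    In a live deployment, a positive return would be flagged to an officer for
    human review before any intervention; embeddings of passers-by who do not
    match would be discarded.
    """
    best_entry: Optional[WatchlistEntry] = None
    best_score = threshold
    for entry in watchlist:
        score = cosine_similarity(face_embedding, entry.embedding)
        if score >= best_score:
            best_entry, best_score = entry, score
    return best_entry
```

As Ground 1 makes clear, the legal difficulty lies not in this matching step itself but in who may lawfully be enrolled in the watchlist and where the cameras supplying the probe images may be deployed.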


Ground 2: The Divisional Court made an error of law in assessing whether SWP’s use of AFR at the December 2017 and March 2018 deployments constituted a proportionate interference with Article 8 rights within Article 8(2). The Divisional Court failed to consider the cumulative interference with the Article 8 rights of all those whose facial biometrics were captured as part of those deployments.

The Court rejected this submission finding that “the balancing exercise which the principle of proportionality requires is not a mathematical one; it is an exercise which calls for judgement.” at [143].


Ground 3: The Divisional Court was wrong to hold that SWP’s DPIA complied with the requirements of section 64 of the DPA 2018. The DPIA is based on two material errors of law concerning the (non)engagement of the rights in Article 8 of the Convention and the processing of the (biometric) personal data of persons whose facial biometrics are captured by AFR but who are not on police watchlists used for AFR.

Given its finding on Ground 1, the Court of Appeal found that the DPIA completed by the police was in and of itself insufficient to comply with the requirements of s.64(3)(b) and (c) of the DPA 2018. It did not adequately address the questions of who could be placed on a watchlist or where AFR Locate could be deployed, and so could not ensure that its use would be in accordance with the law.


Ground 4: The Divisional Court erred in declining to reach a conclusion as to whether SWP has in place an “appropriate policy document” within the meaning of section 42 of the DPA 2018 (taken with section 35(5) of the DPA 2018), which complies with the requirements of that section. Having in place such a document is a condition precedent for compliance with the first data protection principle (lawful and fair processing) contained in section 35 of the DPA 2018 where the processing of personal data constitutes “sensitive processing” within the meaning of section 35(8) of the DPA.

It was found to be entirely sufficient and appropriate for the Divisional Court to leave the further development of the policy to SWP following advice from the ICO. The policy was considered barely sufficient to meet the s.42 requirements, but it was within the Court’s discretion to leave its adjustment to SWP and to subsequent commentary by the Information Commissioner.


Ground 5: The Divisional Court was wrong to hold that SWP complied with the Public Sector Equality Duty in circumstances in which SWP’s Equality Impact Assessment was obviously inadequate and was based on an error of law (failing to recognise the risk of indirect discrimination) and SWP’s subsequent approach to assessing possible indirect discrimination arising from the use of AFR is flawed. It is argued that the Divisional Court failed in its reasoning to appreciate that the PSED is a continuing duty.

The Court of Appeal highlighted that this submission concerned the ongoing public obligations of SWP to ensure the processes it used were not discriminatory. The fact that the automated matching process was backed by a human failsafe, an officer who checked each positive match before any intervention, was not sufficient to meet the PSED.

Ultimately, “the fact remains, however, that SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex.” at [199].

Further guidance was provided by the Court as to the standard expected:

“We would hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.” at [201]
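By way of illustration only, the short sketch below shows one way a force might begin the kind of verification the Court describes: measuring false-match rates separately for different demographic groups on a labelled evaluation set and looking for disparities. The column names, threshold and file name are hypothetical assumptions for this sketch, not a methodology used by SWP or endorsed by the Court.

```python
# Minimal sketch, assuming a labelled evaluation set of face-pair comparisons,
# of the kind of demographic bias check the Court envisages: comparing
# false-match rates across groups. Column names, the threshold and the file
# name are hypothetical; this is not a prescribed or endorsed methodology.
import pandas as pd


def false_match_rate(pairs: pd.DataFrame, threshold: float = 0.6) -> float:
    """Proportion of different-person pairs wrongly scored at or above the threshold."""
    non_matches = pairs[~pairs["is_same_person"]]
    if non_matches.empty:
        return 0.0
    return float((non_matches["similarity"] >= threshold).mean())


def fmr_by_group(pairs: pd.DataFrame, group_col: str, threshold: float = 0.6) -> pd.Series:
    """False-match rate per demographic group (e.g. recorded ethnicity or sex)."""
    return pairs.groupby(group_col).apply(lambda g: false_match_rate(g, threshold))


# Hypothetical usage, with columns: similarity, is_same_person, ethnicity, sex
# pairs = pd.read_csv("evaluation_pairs.csv")
# print(fmr_by_group(pairs, "ethnicity"))
# print(fmr_by_group(pairs, "sex"))
```

Material differences in these per-group rates would be exactly the sort of evidence of racial or gender bias that, on the Court’s reasoning, a force should seek out, directly or via independent verification, before deploying the software.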

South Wales Police have not yet confirmed whether they will appeal the finding.

Privacy concerns around Amazon’s Ring

“A home security product upscaled and diversified into law enforcement and integrated with facial recognition software brings with it some serious privacy concerns.”

What is the Ring?


The Ring is Amazon’s best-selling smart security product line. The most notable product is the Ring video doorbell, which allows users to monitor movement at their front doors, record video and receive mobile notifications whenever someone presses the doorbell. Users also benefit from an app installed on their mobile which monitors local news and allows social-media-style sharing with other Ring users.

Ring additionally offers security services, cross-selling into the wider security service market.

Ring and law enforcement  

Recent controversy was sparked when it was found that Ring is partnering with over 400 police departments in the United States. Ring’s collaborative efforts extend to targeted advertising encouraging users to share live video feed footage with law enforcement. This is in and of itself a significant extension of police surveillance meriting further legislative scrutiny.

However, pair this with the fact that Ring may integrate with, and encourage the use of, Amazon’s Rekognition facial recognition product, and with the company’s dubbing of the service as “the new neighborhood watch”, and it becomes all the more disconcerting.

It is well established that a person’s likeness is considered personal data and that the recording of individuals without their consent is potentially invasive. There are also civil liberties concerns regarding the police acquiring these live video feeds for their own use.

This has drawn the attention of the Senator for Massachusetts, Edward Markey, who recently published a letter sent to Amazon’s CEO Jeff Bezos highlighting civil liberties concerns with Ring. The letter raises an issue previously aired in the United Kingdom in relation to the use of facial recognition software: its potential to racially profile individuals. Whilst this was considered by the Administrative Court to be too intangible an argument, lacking sufficient supporting data, further scrutiny would be most welcome.

Further scrutiny looks to be forthcoming. In his letter Senator Markey highlights 10 key concerns around the Ring system, demanding a response from the Amazon CEO by 26 September 2019. We highly recommend readers consider the letter in its entirety here.

The privacy implications of using facial recognition software

The use of facial recognition software (“FRS”) in security and monitoring was thrust into the spotlight when the Mayor of London, Sadiq Khan, took issue with a London developer over its installation at a King’s Cross site. In this post on the Privacy Perspective we consider the privacy and data protection issues with integrating FRS into security systems, an issue currently before the courts. Continue reading

Big Brother is watching you, in compliance with the European Convention on Human Rights

Revisiting the case of Big Brother Watch and Others v. the United Kingdom

The operation of the UK’s surveillance services, MI5, MI6, GCHQ and the Metropolitan Police Service, and their interaction with human rights (“Convention rights”) have historically been kept obscure to safeguard the interests of national security. The specifics of the policy and practices applied when conducting national surveillance, and their interaction with the private lives of citizens, have only come to light since the whistleblowing of Edward Snowden in 2013, catalysing closer scrutiny of their potential to impinge upon democratic freedoms.

Continue reading