Citation: BBC: England police to get access to NHS Test and Trace Data

The BBC has reported that the police will be granted access to Test and Trace data on a “case-by-case” basis to enforce coronavirus safety laws.

The news comes after the Government admitted in a letter to the Open Rights Group (“ORG”) that no Data Protection Impact Assessment (“DPIA”) was undertaken in the development of its efforts to trace Covid-19 infections. Completing a DPIA is a legal requirement under the GDPR and the Data Protection Act 2018. The ORG correspondence and press release can be found here.

The police will not be given access to the NHS Covid-19 app and will only be given details of whether an individual has been told to self-isolate.

Data processing undertaken for the primary purpose of law enforcement is subject to its own regulatory regime; the ICO guidance can be found here. The data in question is likely to be classified as sensitive health data. As such, it must be demonstrated that the processing is strictly necessary and either satisfies one of the conditions in Schedule 8 of the Data Protection Act 2018 or is based on consent.

It remains to be seen what framework will be developed to ensure data protection compliance and privacy safeguards. A policy document must be in place for this type of processing to be undertaken.

YouTube faces £2bn legal action for alleged misuse of child data

A class action-style lawsuit valued at £2bn has been filed in the High Court against Google, focusing on its subsidiary YouTube’s handling of child user data.

The action alleges that YouTube collected the data of over 5 million British children without parental consent. The requirement of parental consent is enshrined in the General Data Protection Regulation and UK Data Protection Act 2018.

The claimant, privacy expert Duncan McCann, is represented by litigation specialist firm Hausfeld and supported by tech rights group Foxglove.

See coverage from the BBC and Business Wire.

Citation: UK Labour Law Blog: Surveillance and working from home during Covid-19

The UK Labour Law Blog has an excellent piece examining how employer surveillance of employees conflicts with the right to privacy in the Covid-19 work-from-home environment.

Phillipa Collins, Lecturer in Law at the University of Exeter, examines the privacy issues brought into play by increased working-from-home arrangements, where the traditionally private space of the home meets working life.

Cited with compliments: (P Collins, ‘The Right to Privacy, Surveillance-by-Software and the “Home-Workplace”’, UK Labour Law Blog, 3 September 2020, available at https://uklabourlawblog.com)

Police’s use of facial recognition software found unlawful in Court of Appeal – R v Bridges

In R v Bridges [2020] EWCA Civ 1058 the Court of Appeal considered a challenge to the findings of the Divisional Court on five grounds, set out below.

The appeal was allowed on Grounds 1, 3 and 5.

The case concerned the lawfulness of South Wales Police using Automated Facial Recognition (“AFR”) software on two occasions. The lawfulness of the police’s actions had been upheld at Divisional Court level. The case therefore has implications for the roll out of facial recognition software by the police on a national basis.

R v Bridges concerns South Wales Police’s trial of facial recognition software, a matter highly contested by privacy advocates.

Ground 1: The Divisional Court erred in concluding that the interference with the Appellant’s rights under Article 8(1) of the Convention, taken with section 6 of the HRA 1998, occasioned by SWP’s use of AFR on 21 December 2017 and 27 March 2018 and on an ongoing basis, was/is in accordance with the law for the purposes of Article 8(2).

This Ground was upheld, the Court finding that there was no sufficient legal framework for the use of AFR. In particular, it was found that:

“It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed… too much discretion is left to individual police officers” at [91].

The Surveillance Camera Code of Practice was also insufficient, as neither who could be placed on the watchlist nor where AFR might be deployed was adequately defined: “two critical defects in the legal framework” at [120].

South Wales Police’s own policies did not provide sufficiently prescriptive requirements either, merely stating in its Data Protection Impact Assessment:

“As we are testing the technology South Wales Police have deployed in all event types ranging from high volume music and sporting events to indoor arenas.”

This statement was merely descriptive and very broad-ranging.

Thus the use of AFR Locate was not in accordance with the law: the first limb of the two-stage test for Article 8(2) compliance was not met.


Ground 2: The Divisional Court made an error of law in assessing whether SWP’s use of AFR at the December 2017 and March 2018 deployments constituted a proportionate interference with Article 8 rights within Article 8(2). The Divisional Court failed to consider the cumulative interference with the Article 8 rights of all those whose facial biometrics were captured as part of those deployments.

The Court rejected this submission finding that “the balancing exercise which the principle of proportionality requires is not a mathematical one; it is an exercise which calls for judgement.” at [143].


Ground 3: The Divisional Court was wrong to hold that SWP’s DPIA complied with the requirements of section 64 of the DPA 2018. The DPIA is based on two material errors of law concerning the (non)engagement of the rights in Article 8 of the Convention and the processing of the (biometric) personal data of persons whose facial biometrics are captured by AFR but who are not on police watchlists used for AFR.

Given its finding on Ground 1, the Court of Appeal held that the DPIA completed by the police was in and of itself insufficient to comply with the requirements of s.64(3)(b) and (c) of the DPA 2018. It did not adequately address the questions of who could be placed on a watchlist or where AFR Locate could be deployed, and so could not ensure that its use was in accordance with the law.


Ground 4: The Divisional Court erred in declining to reach a conclusion as to whether SWP has in place an “appropriate policy document” within the meaning of section 42 of the DPA 2018 (taken with section 35(5) of the DPA 2018), which complies with the requirements of that section. Having in place such a document is a condition precedent for compliance with the first data protection principle (lawful and fair processing) contained in section 35 of the DPA 2018 where the processing of personal data constitutes “sensitive processing” within the meaning of section 35(8) of the DPA.

The Court of Appeal found that it was entirely sufficient and appropriate for the Divisional Court to leave the further development of the policy to SWP, following advice from the ICO. The policy was considered barely sufficient to meet the s.42 requirements, but it was within the Court’s discretion to leave its adjustment to SWP and to subsequent commentary by the Information Commissioner.


Ground 5: The Divisional Court was wrong to hold that SWP complied with the Public Sector Equality Duty in circumstances in which SWP’s Equality Impact Assessment was obviously inadequate and was based on an error of law (failing to recognise the risk of indirect discrimination) and SWP’s subsequent approach to assessing possible indirect discrimination arising from the use of AFR is flawed. It is argued that the Divisional Court failed in its reasoning to appreciate that the PSED is a continuing duty.

The Court of Appeal highlighted that this submission concerned SWP’s ongoing public law obligation to ensure that the processes it used were not discriminatory. The fact that the matching process was automated, with a human failsafe in the form of an officer who checked each positive match before any intervention, was not sufficient to meet the PSED.

Ultimately, “the fact remains, however, that SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex” at [199].

Further guidance was provided by the Court as to the standard expected of forces adopting the technology:

“We would hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.” at [201]

South Wales Police have not yet confirmed whether they will appeal the finding.

Parts of Meghan Markle’s claim against Associated Newspapers struck out following preliminary hearing

On 1 May 2020 Mr Justice Warby handed down judgment concerning a pre-trial application by Associated Newspapers in its ongoing defence of claims of misuse of private information, copyright infringement, and breach of data protection rights brought by Meghan Markle, HRH The Duchess of Sussex.

Privacy protection in practice: The coronavirus and healthcare data

TTP extends its best wishes to all those impacted by the coronavirus and hopes that all are safe and well. For those readers based in the UK, the NHS coronavirus guidance can be found here and Government guidance here. Stay home, stay safe.

Citation: Wired: The privacy of crypto-currency

Wired has published an insightful article on virtual currencies.

The article considers the privacy implications of crypto-currency transactions. It highlights the issues surrounding logging each transaction in a publicly available manner and concerns around behavioural modelling.

The article considers the providers Monero and Zcash in particular.