Citation: BBC: England police to get access to NHS Test and Trace Data

The BBC has reported that the police will be granted access to Test and Trace data on a “case-by-case” basis to enforce coronavirus safety laws.

The news comes after the Government admitted, in a letter to the Open Rights Group (“ORG”), that no Data Protection Impact Assessment (“DPIA”) was undertaken in the development of its efforts to trace Covid-19 infections. Completing a DPIA is a legal requirement under the GDPR and the Data Protection Act 2018. The ORG correspondence and press release can be found here.

The police will not be given access to the NHS Covid-19 app and will only be given details of whether an individual has been told to self-isolate.

Data processing undertaken for the primary purpose of law enforcement is subject to its own regulatory regime; the ICO guidance can be found here. The data in question is likely to be classified as sensitive health data. As such, it must be demonstrated that the processing is strictly necessary and that it either satisfies one of the conditions in Schedule 8 of the Data Protection Act 2018 or is based on consent.

It remains to be seen what framework will be developed to ensure data protection compliance and privacy safeguards. A policy document must be in place before this type of processing can be undertaken.

YouTube faces £2bn legal action for alleged misuse of child data

A class action-style lawsuit valued at £2bn has been filed in the High Court against Google, focusing on its subsidiary YouTube’s handling of child user data.

The action alleges that YouTube collected the data of over 5 million British children without parental consent. The requirement of parental consent is enshrined in the General Data Protection Regulation and UK Data Protection Act 2018.

The claimant, privacy expert Duncan McCann, is represented by litigation specialist firm Hausfeld and supported by tech rights group Foxglove.

See coverage from the BBC and Business Wire.

Citation: UK Labour Law Blog: Surveillance and working from home during Covid-19

The UK Labour Law Blog has an excellent piece examining how employer surveillance of employees conflicts with the right to privacy in the Covid-19 work-from-home environment.

Philippa Collins, Lecturer in Law at the University of Exeter, examines the privacy issues brought into play by increased working-from-home arrangements, where the traditionally private space of the home meets working life.

Cited with compliments: (P Collins, ‘The Right to Privacy, Surveillance-by-Software and the “Home-Workplace”’, UK Labour Law Blog, 3 September 2020, available at https://uklabourlawblog.com)

Police’s use of facial recognition software found unlawful in Court of Appeal – R (Bridges) v South Wales Police

In R (Bridges) v South Wales Police [2020] EWCA Civ 1058 the Court of Appeal considered a challenge to the findings of the Divisional Court on the five grounds set out below.

It should be noted that, of these grounds, the appeal was allowed on Grounds 1, 3 and 5.

The case concerned the lawfulness of South Wales Police (“SWP”) using Automated Facial Recognition (“AFR”) software on two occasions. The lawfulness of the police’s actions had been upheld at Divisional Court level. The case therefore has implications for the roll-out of facial recognition software by police forces on a national basis.

R (Bridges) concerns South Wales Police’s trial of facial recognition software, a matter highly contested by privacy advocates.

Ground 1: The Divisional Court erred in concluding that the interference with the Appellant’s rights under Article 8(1) of the Convention, taken with section 6 of the HRA 1998, occasioned by SWP’s use of AFR on 21 December 2017 and 27 March 2018 and on an ongoing basis, was/is in accordance with the law for the purposes of Article 8(2).

This ground succeeded: the Court found that there was no sufficient legal framework for the use of AFR. In particular, it found that:

“It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR can be deployed… too much discretion is left to individual police officers” at [91].

The Surveillance Camera Code of Practice was insufficient, as neither who could be placed on the watchlist nor where AFR might be deployed was sufficiently defined: “two critical defects in the legal framework” at [120].

SWP’s own policies did not impose sufficiently prescriptive requirements either, with its Data Protection Impact Assessment merely stating:

“As we are testing the technology South Wales Police have deployed in all event types ranging from high volume music and sporting events to indoor arenas.”

This was merely descriptive and very broad-ranging.

Thus the use of AFR Locate was not in accordance with the law: the first limb of the two-stage test for Article 8(2) compliance was not met.


Ground 2: The Divisional Court made an error of law in assessing whether SWP’s use of AFR at the December 2017 and March 2018 deployments constituted a proportionate interference with Article 8 rights within Article 8(2). The Divisional Court failed to consider the cumulative interference with the Article 8 rights of all those whose facial biometrics were captured as part of those deployments.

The Court rejected this submission, finding that “the balancing exercise which the principle of proportionality requires is not a mathematical one; it is an exercise which calls for judgement” at [143].


Ground 3: The Divisional Court was wrong to hold that SWP’s DPIA complied with the requirements of section 64 of the DPA 2018. The DPIA is based on two material errors of law concerning the (non)engagement of the rights in Article 8 of the Convention and the processing of the (biometric) personal data of persons whose facial biometrics are captured by AFR but who are not on police watchlists used for AFR.

Given its finding on Ground 1, the Court of Appeal held that the DPIA completed by the police was in and of itself insufficient to comply with the requirements of s.64(3)(b) and (c) of the DPA 2018: it did not sufficiently anticipate the questions of who could be placed on a watchlist or where AFR could be deployed, and so could not render the use of AFR in accordance with the law.


Ground 4: The Divisional Court erred in declining to reach a conclusion as to whether SWP has in place an “appropriate policy document” within the meaning of section 42 of the DPA 2018 (taken with section 35(5) of the DPA 2018), which complies with the requirements of that section. Having in place such a document is a condition precedent for compliance with the first data protection principle (lawful and fair processing) contained in section 35 of the DPA 2018 where the processing of personal data constitutes “sensitive processing” within the meaning of section 35(8) of the DPA.

The Court found it entirely sufficient and appropriate for the Divisional Court to leave the further development of the policy to SWP, informed by the advice of the ICO. The policy was considered barely sufficient to meet the s.42 requirements, but it was within the Court’s discretion to leave its refinement to SWP and to subsequent commentary by the Information Commissioner.


Ground 5: The Divisional Court was wrong to hold that SWP complied with the Public Sector Equality Duty in circumstances in which SWP’s Equality Impact Assessment was obviously inadequate and was based on an error of law (failing to recognise the risk of indirect discrimination) and SWP’s subsequent approach to assessing possible indirect discrimination arising from the use of AFR is flawed. It is argued that the Divisional Court failed in its reasoning to appreciate that the PSED is a continuing duty.

The Court of Appeal highlighted that this submission concerned SWP’s ongoing public law obligation to ensure that the processes it used were not discriminatory. The fact that the matching process itself was automated but included a human failsafe, an officer who checked each positive match before intervention, was not sufficient to meet the PSED.

Ultimately, “the fact remains, however, that SWP have never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex” at [199].

The Court provided further guidance as to the standard expected of police forces:

“We would hope that, as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.” at [201]

South Wales Police have not yet confirmed whether they will appeal the finding.

The Schrems II case: EU-US data transfers left in question

The European Court of Justice has handed down its highly anticipated ruling in the Schrems II case. The case considered the validity of the EU-US Privacy Shield and the efficacy of Standard Contractual Clauses (“SCC”) as data transfer protection mechanisms.

In this landmark case the Court found that the EU Commission’s adequacy decision underpinning the EU-US Privacy Shield framework was invalid. This leaves the mechanism for conducting EU-US data transfers in question. The matter may be addressed by recent discussions between the UK and US about entering into a separate data sharing agreement. However, in the interim a transitional mechanism is sorely needed, alongside guidance for data processors, to clarify how data sharing between the countries can be regulated and data subjects’ rights safeguarded.

The SCC regime was affirmed as valid; however, it was suggested that companies and regulators undertake a case-by-case analysis of risk. In particular, it was highlighted that such an assessment should take place where government access to data is mandated. This is a highly topical issue in the US, given current efforts to put in place a federal data protection regime.

For more details on the Schrems II case see:

The IAPP

INFORRM

Law firm Bird & Bird

The ICO’s press release

UK government releases NHS Covid-19 data sharing agreements

Following significant pressure from groups such as openDemocracy and Foxglove, the UK government has released its data sharing contracts with companies such as Amazon, Google and Microsoft for the creation of a cloud database for sharing Covid-19-related data. Contracts with AI firms Palantir and Faculty were also released.

This promotes transparency and accountability around efforts to establish contact tracing technology and centralised databases to combat Covid-19. The potential access to high volumes of healthcare data via these databases merits a high level of scrutiny under privacy and data protection laws. However, groups such as openDemocracy have raised concerns about the sharing of high volumes of NHS data and the risk posed by significant third-party exposure; in particular, openDemocracy criticised the credibility of AI firms Palantir and Faculty.

In a recent press release from openDemocracy the contracts were made public:

View Google NHS agreements (PDF, 0.7 MB)

View Faculty NHS agreements (PDF, 0.9 MB)

View Palantir NHS agreements (PDF, 11.6 MB)

View Microsoft NHS agreements (PDF, 1.5 MB)

NHS England has also released the Data Protection Impact Assessment which was undertaken prior to forming a centralised data storage facility for Covid-19-related data. This database holds data ranging from regional infection maps to 999 call data and bed capacities.

The NHS takes a ‘cloud first’ approach to ensure that data is used as effectively as possible: all data is collated in a single cloud database, providing both security and accessibility.

Parts of Meghan Markle’s claim against Associated Newspapers struck out following preliminary hearing

On 1 May 2020 Mr Justice Warby handed down judgment on a pre-trial application by Associated Newspapers in its ongoing defence of claims brought by Meghan Markle, HRH The Duchess of Sussex, for misuse of private information, copyright infringement and breach of data protection rights.

Privacy protection in practice: The coronavirus and healthcare data

TTP extends its best wishes to all those impacted by the coronavirus and hopes that all are safe and well. For those readers based in the UK, the NHS coronavirus guidance can be found here and Government guidance here. Stay home, stay safe.