Top 10 Privacy and Data Protection Cases 2022

Inforrm covered a wide range of data protection and privacy cases in 2022. Following my posts in 2018, 2019, 2020 and 2021, here is my selection of notable privacy and data protection cases across 2022.

  1. ZXC v Bloomberg [2022] UKSC 5

This was the seminal privacy case of the year, decided by the UK Supreme Court. The Court considered whether, in general, a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation.


Festive wishes from TPP

We would like to thank all our readers and subscribers for visiting TPP over the past year. Many thanks also to our contributors across the past year for their insight and expertise.

We are currently working on getting more informative pieces on privacy to you, including a series on what privacy law is like to practise as a professional (if you would like to contribute, be sure to let us know) and our traditional Top 10 cases of the year across defamation, privacy law and data protection, in association with the esteemed International Forum for Responsible Media Blog.

In the meantime, if any of our readers would like to guest write for us, we encourage you to get in touch – we always welcome the opportunity to work with you.

Our case quote of the year is from the seminal case heard before the UK Supreme Court, ZXC v Bloomberg [2022] UKSC 5, which found that, as a legitimate starting point, criminal suspects have a reasonable expectation of privacy in the fact of an investigation at the pre-charge stage:

…whether there is a reasonable expectation of privacy in the relevant information is a fact-specific enquiry which requires the evaluation of all circumstances in the individual case… We consider that the courts below were correct in articulating such a legitimate starting point to the information in this case. This means that once the claimant has set out and established the circumstances, the court should commence its analysis by applying the starting point.

[And, as such:]

The courts below were correct to hold that, as a legitimate starting point, a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation and that in all the circumstances this is a case in which that applies and there is such an expectation.

per Lord Hamblen and Lord Stephens at [144] and [146]

See our comment on the case for more information.

A very happy Christmas and New Year to you all.

The Privacy Perspective Founder and Editor, Suneet Sharma

The Online Safety Bill: Everything in Moderation? – Naomi Kilcoyne – Parts I and II

A number of Bills proposed by the recent Conservative governments have sparked controversy among commentators: among them, the Northern Ireland Protocol Bill, the Retained EU Law Bill, and the ‘British’ Bill of Rights Bill. Taking its place in the rogues’ gallery is the Online Safety Bill (OSB).

Now returning to the House of Commons on 5 December 2022 to finish its Report Stage, the OSB has come some way since the ‘Online Harms’ White Paper published in April 2019. The Bill raises important questions about freedom of expression, online speech regulation and government (over)reach.

This article has four principal components.

Part I lays out the content and objectives of the Bill, highlighting its legislative development and the key issues arising from that. Part II situates the Bill within the wider context of online regulation, considering how recent developments may inflect the Bill’s impact.

This provides the framework for Part III, which addresses the various criticisms that the Bill has received from commentators across the political spectrum. Part IV then examines the broader legal and theoretical consequences of the Bill, posing further questions to be answered. Some conclusions are drawn in Part V.

An appended Part VI briefly outlines the most recent updates to the Bill.

PART I: CONTENT

Much of the OSB’s content was clarified by the Commons Digital, Culture, Media and Sport (DCMS) Committee Report in January 2022, and the Government’s Response to this in March 2022.

As these reports confirmed, the main priority of the OSB is evident from its name change. Now couched in broader terms, the Bill is designed to protect internet users’ online safety by way of three central objectives (Response, at [2]):

  1. To tackle illegal content and activity.
  2. To deliver protection for children online.
  3. To afford adults greater control, while protecting freedom of expression.

To achieve these objectives, the Bill operates on a duty of care model. Under this model, online platforms are liable only for their own conduct: the Bill seeks to hold platforms responsible for systemic ‘lack of diligence in failing to adopt preventive or remedial measures’ (Report, at [7]). This is, in theory, a less stringent regulatory model than ‘intermediary liability’, under which online platforms would also be liable for others’ content and activity.

Moreover, service providers will not owe a limitless duty of care (Report, at [4]). Instead, the Bill divides providers into various categories, which in turn are subject to specific duties. For example, Category 1 (high-risk and high-reach, user-to-user) services are deemed to be the largest and most risky, so incur additional duties as compared to Categories 2A (all regulated search services) and 2B (the remaining regulated user-to-user services).

Enforcement of such duties lies not with the government, but with the regulatory authority Ofcom, to which the legislation grants oversight and enforcement powers (Response, at [3]).

Central to the Bill’s duty of care model is its typology of online content. Initially, the OSB distinguished illegal from legal material, the latter of which it subdivided into two – producing three content typologies to align with the Bill’s stated objectives:

  1. Illegal content
  2. Legal but harmful content
    a. Content that is harmful to children
    b. Content that is harmful to adults (for Category 1 services)

The Bill originally defined each type of content as follows (Report, at [5]):

  • Illegal content: content whose use / dissemination constitutes a relevant offence
  • Content harmful to children and adults:
    • Designated – content of a type designated in regulations made by the Secretary of State
    • Non-designated – content which fulfils one of the general definitions
      • These apply where the provider has reasonable grounds to believe that there is a material risk of the content having (even indirectly) a significant adverse physical / psychological impact on a child or adult (as applicable) of ordinary sensibilities.

These definitions were essential to the Bill’s regulatory framework, since they directly underpinned the associated risk assessment and safety duties (Report, at [6]). Simply put, how content is defined determines what a provider is required (or not) to do about it. The lower the definitional bar, the more content is subject to regulation – and, potentially, removal.

While illegal content has certainly provoked discussion, controversy has principally surrounded the ‘legal but harmful’ debate. The regulation of such content raises the question: can moderation be justified where the content, by its nature, does not meet the criminal standard?

Of particular interest are the Government’s subsequent amendments to the draft Bill, following the DCMS Report. Despite accepting eight of the Committee’s recommendations, the Government’s Response stated in the legal but harmful context that ‘rather than using the Committee’s proposed reframing, we have made other changes that meet a similar objective’ (at [29]).  

As the Bill stood in March 2022, the Government had amended its position in the following key areas:

  1. Definition of ‘harmful’ – This was simplified under the revised Bill: content had to present a material risk of significant harm to an appreciable number of children/adults (Response, at [30]). The key threshold to engage safety duties was one of ‘priority’ harmful content.
  2. Designation of types of harmful content – As envisaged in the draft Bill, priority content harmful to children and adults was to be designated by the Secretary of State in secondary legislation, following consultation with Ofcom. This would now be subject to the affirmative resolution procedure, to maximise parliamentary scrutiny (Response, at [12], [55]-[57]). The government also published an indicative list of what might be designated under the Bill as priority harmful content.
  3. Non-priority content harmful to adults – The revised Bill removed the obligation upon service providers to address non-priority content harmful to adults. Companies were required only to report its presence to Ofcom (Response, at [31], [40]).

According to a Ministerial Statement released in July 2022, service providers’ safety duties regarding ‘legal but harmful’ content could thus be broken down as follows:

  1. Children – Primary priority content harmful to children
    • Services must prevent children from encountering this type of content altogether
  2. Children – Priority content harmful to children
    • Services must ensure content is age-appropriate for their child users
  3. Adults – Priority content harmful to adults
    • Applies only to Category 1 services
    • These must address such content in their terms and conditions, but may set their own tolerance: this may range from removing such content, to allowing it freely.

PART II: CONTEXT

To understand the ‘legal but harmful’ debate more fully, we must situate the OSB in context.

Europe:

In the EU, the recently adopted Digital Services Act (DSA) shares some similarities with the OSB: both provide a legal framework for online platforms’ duties regarding content moderation.

However, Dr Monica Horten has identified the following distinctions:

  • The DSA focuses on regulating illegal rather than merely ‘harmful’ content. In doing so, according to the non-profit Electronic Frontier Foundation, the DSA ‘avoids transforming social networks and other services into censorship tools’ – a position from which the OSB’s broader scope deviates.
  • The DSA unequivocally recognises the right to freedom of expression as guaranteed by Article 11 of the Charter of Fundamental Rights, in accordance with which service providers must act when fulfilling their obligations. The adequacy of free speech protection under the OSB may be less assured, as considered below.
  • The measures also differ in their provision of redress. While the DSA includes both prospective and retrospective procedural safeguards for users who have acted lawfully, the OSB arguably falls short – despite the Government’s assurance that users’ access to courts would not be impeded by the Bill’s ‘super-complaints mechanism’ (Response, at [18]).

It is also worth noting the proposed European Media Freedom Act (EMFA), broadly hailed as a positive step for journalistic pluralism within the EU. Granted, the OSB purports to exclude the press (‘news publishers’) from its content moderation rules. However, uncertainty remains as to the possible regulation of comments sections on newspaper websites, not to mention newspapers’ own activity on social media.

USA:

Across the Atlantic, the US courts show some signs of a legal vacuum developing around over-moderation. Recent attempts by social media users to challenge online content moderation by asserting their First Amendment rights have failed, on the basis that sites such as Facebook and Twitter are not ‘state actors’, but rather private actors not subject to constitutional claims.

As a counterpoint, the recent takeover of Twitter by Elon Musk may illustrate the risks of under-moderation. Concerns are particularly acute in light of Musk’s reinstatement of banned high-profile accounts – having stated he would wait until a new ‘content moderation council’ had convened – and his announcement of a general amnesty. This follows the removal of thousands of Twitter content moderators, and swift resurgence of hate speech and misinformation.

UK:

Returning to the UK, the wider position of freedom of expression is somewhat ambiguous.

On the one hand, the aforementioned Bill of Rights Bill (BORB) claims to improve safeguards: clause 4 requires judges to give ‘great weight’ to protecting freedom of expression. However, the former Deputy President of the Supreme Court, Lord Mance, has queried how different this is to the ‘particular regard’ provision in s 12(4) of the HRA. Other commentators have questioned whether this presumptive priority of Article 10 may in fact skew the balance in privacy actions, which rely on the presumptive parity between Articles 8 and 10. On either analysis, the BORB’s parallel statutory attempt to enshrine freedom of expression – recalling the OSB’s third objective – is not encouraging.

On the other hand, calls for greater online regulation have gained traction following the inquest into the death of the British teenager Molly Russell. The senior coroner found in October that the 14-year-old had suffered from ‘the negative effects of on-line content’, calling inter alia for ‘the effective regulation of harmful on-line content’, and for legislation ‘to ensure the protection of children’ against its effects. This offers a compelling policy argument in favour of the OSB’s second objective.

This overview of the Bill’s content and context provides the factual basis for a normative analysis of its criticisms and consequences in Parts III and IV.

Naomi Kilcoyne is a Visiting Lecturer in Public Law at City University, having completed her GDL there in 2021-22. She has a particular interest in the interplay between public and private law.

The Personal Data life cycle: Where to start the analysis? – Vladyslav Tamashev, Privacy lawyer at Legal IT Group

Have you ever thought about the data on your computer? It doesn’t matter whether you are a content creator, a programmer, or just a regular user: thousands of different files have been created, downloaded, and altered on your device. But what happens when some of that data becomes useless to you?

Usually, this data will be manually deleted to free up space on your storage device, or it will be wiped during an OS reinstallation. Everything that happens to that data, from its creation or collection until its destruction, is called the data life cycle.

The data life cycle is the sequence of stages that a particular unit of data passes through. The simplified life cycle model has five basic stages: Collection, Processing, Retention, Disclosure, Destruction. In practice, when we talk about the personal data life cycle, this sequence can be dramatically different, depending on the type of information, its usage, origin, company policies, and personal data protection regulations and legislation.


Top 10 Privacy and Data Protection Cases of 2021: A selection – Suneet Sharma

Inforrm covered a wide range of data protection and privacy cases in 2021. Following my posts in 2018, 2019 and 2020, here is my selection of the most notable privacy and data protection cases across 2021:

  1. Lloyd v Google LLC [2021] UKSC 50

In the most significant privacy law judgment of the year, the UK Supreme Court considered whether a class action could be brought against Google for breach of its obligations as a data controller under s.4(4) Data Protection Act 1998 (“DPA”) in its application of the “Safari Workaround”. The claim for compensation was made under s.13 DPA 1998. The amount claimed per person advanced in the letter of claim was £750. Collectively, given the number of people impacted by the processing, the potential liability of Google was estimated to exceed £3bn.

Lord Leggatt handed down the unanimous judgment in favour of the appellant, Google LLC:

“the claim has no real prospect of success. That in turn is because, in the way the claim has been framed in order to try to bring it as a representative action, the claimant seeks damages under section 13 of the DPA 1998 for each individual member of the represented class without attempting to show that any wrongful use was made by Google of personal data relating to that individual or that the individual suffered any material damage or distress as a result of a breach of the requirements of the Act by Google.”

The case has been heralded for its central importance in determining the viability of data protection class actions. The case drew wide coverage from Pinsent Masons, Hill Dickinson, Clifford Chance, Bindmans and Stewarts.

  2. HRH The Duchess of Sussex v Associated Newspapers Limited [2021] EWHC 273 (Ch) and [2021] EWCA Civ 1810

In February 2021 Meghan, Duchess of Sussex, won her application for summary judgment against the Mail on Sunday.  Warby LJ said there were “compelling reasons” for it not to go to trial over its publication of extracts of a private letter to her estranged father, Thomas Markle.  He entered judgment for the Duchess in misuse of private information and copyright.  There was a news piece on Inforrm and a piece by Dominic Crossley.

Associated Newspapers was granted permission to appeal and the appeal was heard on 9 and 11 November 2021, with judgment handed down on 2 December 2021. The Court (Sir Geoffrey Vos MR, Sharp P and Bean LJ) unanimously dismissed the appeal on all grounds, stating:

“Essentially, whilst it might have been proportionate to disclose and publish a very small part of the Letter to rebut inaccuracies in the People Article, it was not necessary to deploy half the contents of the Letter as Associated Newspapers did. As the Articles themselves demonstrate, and as the judge found, the primary purpose of the Articles was not to publish Mr Markle’s responses to the inaccurate allegations against him in the People Article. The true purpose of the publication was, as the first 4 lines of the Articles said: to reveal for the first time [to the world] the “[t]he full content of a sensational letter written by [the Duchess] to her estranged father shortly after her wedding”. The contents of the Letter were private when it was written and when it was published, even if the claimant, it now appears, realised that her father might leak its contents to the media.” [106]

 The case has been analysed on INFORRM by Brian Cathcart.

  3. Australian Competition and Consumer Commission v Google LLC (No 2) [2021] FCA 367

The Federal Court of Australia found that Google misled some users about the personal location data it collected through Android devices between January 2017 and December 2018.

The Court found that, in providing the option “Don’t save my Location History in my Google Account”, Google represented to some reasonable consumers that they could prevent their location data being saved to their Google Account. In fact, users needed to change an additional, separate setting to stop their location data being saved to their Google Account.

Inforrm had a case comment.

  4. Hájovský v. Slovakia [2021] ECHR 591

Mr Hájovský placed an anonymous advert in a national newspaper offering payment to a woman in return for giving birth to his child. An investigative reporter posed as a candidate interested in surrogacy, replied to the advert and secretly filmed the ensuing meetings. These were later compiled into a documentary. A national tabloid also covered the story, using stills of the footage and taking a critical stance on the applicant’s actions. Both stories revealed the applicant’s identity. This prompted the applicant to bring an action against the media groups for violation of his privacy under Slovakian law.

The Slovakian courts dismissed the application on the basis that the article contributed to a matter of public interest – the debate around surrogacy for payment – and that, in any event, the publishing of the advert had brought a private matter, the applicant’s wish to have a child, into the public domain. The ECtHR found in favour of the applicant. In doing so it reiterated the well-established balancing approach between privacy and freedom of expression as per Von Hannover and Axel Springer. In this instance the court found that the applicant’s right to privacy had been violated and that the Slovakian courts had erred in their approach to balancing the competing rights. In doing so the court made key observations about the privacy implications of photographs.

Inforrm has a case comment.

  5. Warren v DSG Retail Ltd [2021] EWHC 2168 (QB)

This case concerned the viability of claims for breach of confidence and misuse of private information against data controllers who have suffered cyber-attacks. In dismissing the claims for breach of confidence and misuse of private information, Saini J found that both causes of action require some form of “positive conduct” by the defendant, which is lacking where the cause of the private information being leaked is a cyber-attack.

Inforrm had a case comment.

  6. ES v Shillington 2021 ABQB 739

In this case the Alberta Court of Queen’s Bench awarded damages under the new “public disclosure of private facts” tort. The case concerned the making public of images of the claimant engaging in sex acts with the defendant; these had been shared during a romantic relationship between 2005 and 2016, during which the parties had two children together. The parties had a mutual understanding that the images would not be shared or published anywhere. However, the defendant then proceeded to share the images online, including images of the sexual assault of the claimant.

Delivering judgment for the claimant, Inglis J accepted the claimant’s submissions that a new “public disclosure of private facts” tort should be recognised as a cause of action separate from those already existing at common law or under statute.

Inforrm has a case comment.

  7. Hurbain v Belgium [2021] ECHR 544

A case in which an order to anonymise a newspaper’s electronic archive was found not to breach the applicant publisher’s right to freedom of expression. This case reflects an important application of the right to be forgotten under Article 8 of the Convention. The applicant, Patrick Hurbain, is the president of the Rossel Group, which owns one of Belgium’s leading French-language newspapers, Le Soir, of which he was previously Managing Editor. The article in question concerned a series of fatal car accidents and named one of the drivers, G, who had been convicted of a criminal offence for his involvement in the incidents. G made a successful application for rehabilitation in 2006.

However, Le Soir created a free, electronic, searchable version of its archives from 1989 onwards, including the article at issue. G relied on the fact that the article appeared in response to a search on his name on Le Soir’s internal search engine and on Google Search. He explained that its availability was damaging to his reputation, particularly in his work as a doctor. The newspaper refused the application but stated it had asked Google to delist/deindex the article.

In 2012 G sued Mr Hurbain as editor of Le Soir and was successful domestically. Mr Hurbain then lodged an application with the Strasbourg Court complaining that the anonymisation order was a breach of Article 10. In balancing the Article 8 and Article 10 rights in the case, the Strasbourg Court found in favour of G.

Inforrm had a case comment.

  8. Peters v Attorney-General on behalf of Ministry of Social Development [2021] NZCA 355

The New Zealand Court of Appeal provided guidance in respect of the tort of invasion of privacy in this high-profile case. In 2017, the Ministry for Social Development (“MSD”) realised that Mr Peters, MP and leader of the New Zealand First Party, had been overpaid New Zealand Superannuation (“NZS”). Due to errors, NZS had been paid at the single rate when it should have been paid at the partner rate. Mr Peters immediately arranged for the overpaid amount to be repaid.

In August 2017 several reporters received anonymous calls in respect of the overpayment. To pre-empt any publicity, Mr Peters released a press statement addressing the incident. He also issued a claim for infringement of the tort of invasion of privacy against several MSD executives.  The High Court found the MSD executives were proper recipients of information and thus the claim failed.  The Court of Appeal dismissed Mr Peters’ appeal. For an invasion of privacy claim to succeed there is a two “limb” test:

  • the existence of facts in respect of which there was a reasonable expectation of privacy; and
  • that the publicity given to those private facts would be considered highly offensive to an objective reasonable person.

The Court agreed that limb one was met on the facts. However, the Court found that Mr Peters did not have a reasonable expectation of protection from disclosure of this information within MSD, or from MSD to the relevant Ministers and select staff. As the claimant could not prove that any of the defendants had released the information to the media, the appeal was dismissed. The case affirmed the removal of the requirement for there to be widespread disclosure and the potential for the removal of the requirement that disclosure be highly offensive.

  9. R (Open Rights Group and the 3 million) v Secretary of State for the Home Department and Others [2021] EWCA Civ 800

A case concerning the lawfulness of the immigration exemption found in paragraph 4 of Schedule 2 to the Data Protection Act 2018. This exemption allows those processing personal data for immigration control purposes to refuse to comply with the data subject rights guaranteed by the GDPR to the extent that complying with those provisions would prejudice those purposes. The Court of Appeal found that this exemption was not compliant with Article 23 of the GDPR.

There was coverage from Hunton Andrews Kurth and 11KBW.

  10. Biancardi v. Italy [2021] ECHR 972

The ECtHR found that holding the editor of an online newspaper liable for failing to de-index an article concerning criminal proceedings did not breach Article 10 of the Convention. The case concerned an application for the delisting of an article about a fight involving a stabbing in a restaurant, which mentioned the names of those involved, including the applicant V.X.

Inforrm had a case comment.

Suneet Sharma is a junior legal professional with a particular interest and experience in media, information and privacy law.  He is the editor of The Privacy Perspective blog.

Quotes from caselaw 2: Sicri v Associated Newspapers [2020] EWHC 3541 (QB) – Privacy and suspicion by the state

The rationale for the general rule, that an individual has a reasonable expectation of privacy in respect of information that they have come under suspicion by the state, is clear: disclosure of such information is likely to have a seriously harmful impact on the person’s reputation, and thus their private life.

Warby J. at p.55

The Sicri case concerned the publication of an article by Mail Online following the arrest of a man suspected of a connection with the Manchester Arena suicide bomber Salman Abedi. Mail Online did not remove the article after the claimant’s release and divulged his name (via an alternative spelling), address and other identifiable details. The claimant was successful and was awarded £83,000 in damages, as he had a reasonable expectation of privacy in respect of his identity remaining private when his arrest was reported. It should be noted that this reasonable expectation was assessed at the pre-charge stage.

The claimant had a right to expect that the defendant would not publish his identity as the 23-year-old man arrested on suspicion of involvement in the Manchester Arena bombing. By 12:47 on 29 May 2017, the defendant had violated that right; it had no, or no sufficient public interest justification for identifying the claimant. It continued to do so. Later, another publisher did the same or similar. But the claimant’s right to have the defendant respect his privacy was not defeated or significantly weakened by the fact that others failed to do so. He is entitled to compensation. The appropriate sum is £83,000 in general and special damages.

Warby J. at 190

This is part of our new “quotes from caselaw” series, looking to bring you short snippets from leading judgments on privacy, which highlight its importance and development.

Privacy Law Monthly Round Up – September 2021

Headlines

Ben and Deborah Stokes’ privacy claim against The Sun over a highly intrusive article detailing traumatic events in the Stokes family’s past was settled on 30 August 2021, with the newspaper agreeing to publish an apology and pay substantial damages. Paul Wragg wrote about The Sun’s “nonsensical” defence for the Inforrm Blog, concluding that the only party spared the anguish of trial was the newspaper’s defence team.

Government and General legislative developments

The controversial Police, Crime, Sentencing and Courts Bill had its second reading in the House of Lords this month. The Bill is notorious for its proposed restrictions on peaceful protest, which critics have predicted will have a discriminatory impact and breach the rights to freedom of expression and assembly. Broadened police powers would also enable the extraction of more information from mobile phones.

The Age Appropriate Design Code (aka the “Children’s Code”) entered into force on 2 September 2021 following a one-year transition period. The Children’s Code explains to businesses how the UK GDPR, the Data Protection Act and the Privacy and Electronic Communications Regulations apply to the design and delivery of Information Society Services (“ISS”) – i.e. social media, educational and gaming platforms – used by children. The Children’s Code is the first of its kind worldwide, and has been welcomed by many as a positive development for keeping children safe online. The 15 standards that the Code sets can be found here.

Sticking with child safety online, Home Secretary Priti Patel launched a Safety Tech Challenge fund at the G7 meeting at the start of this month. Five applicants will be awarded up to £85,000 each to develop new technologies that enable the detection of child sexual abuse material online, without breaking end-to-end encryption.

The UK Government has launched a public consultation on data protection legislation reform following Brexit, entitled Data: A new direction. The consultation is open until 19 November. Following the end of the Brexit transition period, the UK’s data protection regime, which had derived from the EU framework, has been transposed into domestic law as the UK GDPR. The Government is seeking to use this opportunity to make some changes to the current regime. The Hawtalk Blog discusses how some of these proposals are unethical and unsafe. Further discussion can be found on the Panopticon Blog and the Data Protection Report.

Data Privacy and Data Protection

Cressida Dick, the Metropolitan Police Commissioner, has accused tech giants of undermining terrorist prevention efforts by virtue of their focus on end-to-end encryption. Writing in The Telegraph on the twentieth anniversary of the 9/11 attacks, she said that it is “impossible in some cases” for the police to fulfil their role to protect the public. Given the pressure on tech giants to ensure users’ privacy, companies are unlikely to reshape their platforms to facilitate more extensive monitoring.

Apple has delayed its plan to scan its users’ iCloud images for child sexual abuse material. The proposed detection technology would compare images before they are uploaded to iCloud against unique “digital fingerprints” of known child pornographic material maintained by the National Centre for Missing and Exploited Children. The plan was criticised by privacy groups because it involved using an individual’s own device to check if they were potentially engaged in criminal activity.

Surveillance

The Metropolitan Police have invested £3 million into new facial recognition technologies (FRT) that will greatly increase surveillance capabilities in the capital. The expansion of the Met’s technology will enable it to process historic images from CCTV feeds, social media and other sources in order to track down suspects. Critics argue that such FRT encroaches on privacy by “turning back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years.” There is also concern that FRT can exacerbate existing racial discrimination in the criminal justice system. The UK’s Surveillance Camera Commissioner (SCC), Professor Fraser Sampson, has acknowledged that some FRT “are so ethically fraught” that it may only be appropriate to carry them out under license in the future.

NGOs

Big Brother Watch published an opinion piece warning that the imposition of vaccine passports could reorganise Britain into a two-tier, checkpoint society. The article responds to the Scottish Parliament’s vote in favour of vaccine passports earlier this month. Wales has since followed Scotland and announced mandatory vaccination and COVID status check schemes. The English government has not yet committed to such a regime. The ICO has emphasised that data protection laws will not stand in the way of mandatory vaccination and COVID status checks, but rather facilitate responsible sharing of personal data where it is necessary to protect public health. 

Privacy International has considered how data-intensive systems and surveillance infrastructure, developed by national and foreign actors, in Afghanistan as part of developmental and counter-terrorism measures will fare under the Taliban regime.

From the regulator

ICO

The ICO has announced two fines this month:

  • A total of £495,000 was imposed against We Buy Any Car, Saga, and Sports Direct for sending more than 354 million “frustrating and intrusive” nuisance messages between them. None of the companies had permission to send recipients marketing emails or texts, making their behaviour illegal;
  • The Glasgow-based company DialADeal Scotland Ltd was fined £150,000 for making more than 500,000 nuisance marketing calls to recipients who had not given their permission to receive them.

The ICO has also released a communiqué from a meeting on data protection and privacy held by the G7 authorities at the start of the month. The meeting is closely aligned with the Roadmap for Cooperation on Data Free Flow with Trust announced by G7 Digital and Technology Ministers on 28 April 2021.

IPSO

IPSO has published a number of privacy rulings and resolutions.

IMPRESS

There were no IMPRESS rulings relating to privacy this month.

Cases

The Inforrm Blog has published an article detailing the continued decline in privacy injunction applications in England and Wales for 2021. There were only three applications in the first six months of the year, down from ten in 2020. All three applications were successful. Only 4% of the newly issued cases on the Media and Communications List related to misuse of private information or breach of privacy.

No judgments relating to privacy have been handed down this month.


Written by Colette Allen

Colette Allen has hosted “Newscast” on The Media Law Podcast with Dr Thomas Bennett and Professor Paul Wragg since 2018. She has recently finished the BTC at The Inns of Court College of Advocacy and will be starting an MSc in the Social Sciences of the Internet at the University of Oxford in October 2021.

Healthcare data and data protection in the time of coronavirus – Olivia Wint

The processing of special category personal data (including health data e.g. vaccination status, blood type, health conditions etc) was a common topic before the COVID-19 pandemic (the “pandemic”), with various resources published that explored this topic.

For example, the European Data Protection Board (“EDPB”) published an adopted opinion on the interplay between the Clinical Trials Regulation and the General Data Protection Regulation* (“GDPR”) (23 January 2019), the Information Commissioner’s Office (“ICO”) posted a blog on why special category personal data needs to be handled even more carefully (14 November 2019), and the ICO published guidance on the lawful bases for processing special category data in compliance with the GDPR (November 2019).

The pandemic has brought about a number of data protection considerations, all of which were already in existence but have been exacerbated by the pandemic (employee monitoring, contact tracing, the workforce shift from office to home, etc.). One that is more prevalent than ever before is the processing of health data. This piece aims to cover some key data protection themes and practical insights into the processing of health data.

Health data, a subset of special category personal data, by its very nature comes with an increased risk profile. When processing this data type, there are not only legislative data protection requirements and the expectation of good clinical governance practices, but also regulatory body considerations.

For example, the Care Quality Commission has in place a code of practice on confidential personal information, the NHS Health Research Authority has in place GDPR guidance specifically for researchers and study coordinators and technical guidance for those responsible for information governance within their organisation, and the NHS more generally has in place its Data Security and Protection Toolkit (the “Toolkit”). The Toolkit is an online self-assessment tool that enables organisations to measure and publish their performance against the National Data Guardian’s ten data security standards. The Toolkit covers records management and retention, training and awareness, system vulnerability management and crisis management, to name a few.

The above all operates at a national (UK) level. At an international level, there are data protection laws which specifically cover health data, such as HIPAA in the US, the Patient Data Protection Act in Germany, and various provincial health data privacy laws in Canada such as the Health Information Act in Alberta.

Whilst the previous paragraph highlights the complexities of processing health data, whether on a national or international level, in comparison to other data types, there are a number of mitigations that organisations can put in place to adequately reduce the risks associated with processing this type of data. Mitigations such as Data Protection Impact Assessments (“DPIAs”), updated privacy notices and appropriate security measures, amongst other things, should all be considered.

Many organisations that never historically processed health data may now do so as a result of the pandemic…

Covering your bases

The first base that must be covered when processing data is ensuring that an appropriate legal basis has been established for each data processing activity; for example, if health data is processed for both employee monitoring and research, a legal basis for each of these activities will need to be established. Legal bases include the performance of a contract, the legitimate interests** of the organisation and/or compliance with a legal obligation. Where processing of health data is concerned, an additional condition under Article 9 of the UK GDPR must be met. In the healthcare context, applicable additional conditions may include explicit consent, health or social care purposes, public health purposes and/or archiving, research and statistical purposes.

Many organisations that never historically processed health data may now do so as a result of the pandemic; alternatively, organisations that processed health data pre-pandemic may now be doing so in larger amounts. Organisations on either side of the coin should assess the extent to which their privacy notice(s) have been updated and/or need to be updated, in order to make data subjects aware of any applicable data processing changes and to comply with transparency obligations.

Next, large-scale processing of health data may pose a ‘high risk to the rights and freedoms of natural persons’ and, in such cases, will trigger the requirement for a DPIA. In order for a DPIA to have value, it is important for organisations to ensure that the DPIA is assessed and considered early on, so that privacy by design and by default is embedded in any system or processing activity.

A DPIA will assess the likelihood and severity of harm related to the processing activity in question and, should the DPIA identify a high risk with no available mitigations, consultation with the ICO will be needed. The ICO has set out a 9-step lifecycle for the DPIA, all of which should be considered before any data processing takes place (a minimal tracking sketch follows the list):

  1. Identify a need for a DPIA;
  2. Describe the processing;
  3. Consider consultation;
  4. Assess necessity and proportionality;
  5. Identify and assess risks;
  6. Identify measures to mitigate risk;
  7. Sign off and record outcomes;
  8. Integrate outcomes into plan; and
  9. Keep under review.
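
The lifecycle lends itself to being recorded as a simple checklist for the audit trail. The sketch below is only an illustration under that assumption – it is not an ICO tool, and the activity name and field names are invented – showing one way to track which of the nine steps remain outstanding before processing begins.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# The nine ICO DPIA steps, in order, as listed above.
DPIA_STEPS = [
    "Identify a need for a DPIA",
    "Describe the processing",
    "Consider consultation",
    "Assess necessity and proportionality",
    "Identify and assess risks",
    "Identify measures to mitigate risk",
    "Sign off and record outcomes",
    "Integrate outcomes into plan",
    "Keep under review",
]

@dataclass
class DpiaStep:
    name: str
    completed_on: Optional[date] = None   # None until the step is signed off
    notes: str = ""                       # evidence / audit-trail commentary

@dataclass
class DpiaRecord:
    processing_activity: str
    steps: list = field(default_factory=lambda: [DpiaStep(n) for n in DPIA_STEPS])

    def outstanding(self) -> list:
        """Steps not yet completed; processing should not start while any remain."""
        return [s.name for s in self.steps if s.completed_on is None]

# Hypothetical usage: record progress against an invented activity name.
dpia = DpiaRecord("Employee vaccination-status monitoring")
dpia.steps[0].completed_on = date(2021, 9, 1)
print(dpia.outstanding())
```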

Internally, organisations should have appropriate technical and organisational measures in place which reflect the risk presented. In relation to technical measures, appropriate internal controls and security measures should be utilised. Organisations may wish to consider a combination of controls to ensure that health data has the best level of protection; this may include end-to-end encryption for data both in transit and at rest, role-based access within organisations, and the adoption and accreditation of industry-recognised security standards such as ISO 27001.
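
As a concrete illustration of encrypting data at rest, the sketch below uses the widely available Python cryptography package’s Fernet recipe to encrypt a record before it is written to storage. It is only a minimal sketch: the field names are invented, and key management (key vaults, rotation, role-based access to the key) is deliberately out of scope.

```python
# pip install cryptography  -- illustrative only; key management is out of scope.
from cryptography.fernet import Fernet
import json

key = Fernet.generate_key()   # in practice held in a key vault, never stored with the data
fernet = Fernet(key)

record = {"patient_ref": "A-1234", "vaccination_status": "second dose", "blood_type": "O+"}

# Encrypt before writing to disk so the data is protected at rest.
ciphertext = fernet.encrypt(json.dumps(record).encode("utf-8"))
with open("record.bin", "wb") as f:
    f.write(ciphertext)

# Only authorised, role-checked code paths should decrypt on read.
with open("record.bin", "rb") as f:
    restored = json.loads(fernet.decrypt(f.read()).decode("utf-8"))
assert restored == record
```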

In respect of organisational measures, it may be apt for training and awareness sessions to be implemented, with tailored training administered to employees who will be carrying out data processing activities, and a robust policy suite in place covering key circumstances such as data breaches and business continuity.

Data sharing

A specific data processing activity that may be utilised more in the wake of the pandemic is data sharing between organisations for information and research purposes. In England, the soon-to-be-implemented GP Data Sharing Scheme aims to create a new framework for a central NHS digital database built from GP records, and the UK’s Department of Health and Social Care (“DHSC”) has recently published a draft policy paper titled ‘Data saves lives: reshaping health and social care with data’. The policy covers the aspiration of the DHSC to introduce new legislation as part of the Health and Care Bill (currently at Committee stage) to encourage data sharing between private health providers and the NHS, and to put more guard rails around the sharing of data generally by mandating standards for how data is collected and stored.

As the above shows, data sharing is something that will be advocated for and welcomed in due course, so it is important that organisations have in place the appropriate contractual and practical measures to protect data, as data in motion is when it is most vulnerable. Contractual measures include ensuring data sharing and/or transfer agreements are in place which cover all necessary contractual provisions and provide adequate assurances as to the data sharing/transfer arrangements. The NHSX has published a template Data Sharing Agreement which has been labelled as suitable for use by all health and care organisations and includes risk management, legal basis and confidentiality and privacy provisions, amongst other things. Practical measures include conducting due diligence checks on all organisations which may be in receipt of data as part of the data sharing process (including third parties) and anonymising/pseudonymising data. The ICO has put in place a comprehensive data sharing checklist which invites organisations to consider data minimisation, accountability and data subject rights.
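
Pseudonymisation before sharing can be as simple as replacing direct identifiers with a keyed hash, so the recipient can still link records belonging to the same person without seeing the identifier itself. The sketch below is a minimal illustration using Python’s standard library; the identifiers and field names are invented, and the secret key would be held only by the disclosing organisation.

```python
import hmac
import hashlib

SECRET_KEY = b"held-only-by-the-disclosing-organisation"   # illustrative; manage like any other secret

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

rows = [
    {"patient_id": "943-476-5919", "postcode": "LS1 4AP", "outcome": "discharged"},
    {"patient_id": "943-476-5870", "postcode": "M1 1AE", "outcome": "readmitted"},
]

# Share only the pseudonym and the minimum fields needed (data minimisation).
shared = [{"pseudonym": pseudonymise(r["patient_id"]), "outcome": r["outcome"]} for r in rows]
print(shared)
```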

The pandemic has changed the world as we knew it in more ways than one and, in the context of health data, what seems certain is that the processing of health data is on the rise. As such, organisations should continue to monitor guidance and developments in this area and ensure data protection principles are at the core of all data processing activities as a first port of call.

* EDPB guidelines are no longer directly relevant to the UK data protection regime and are not binding under the UK regime.

** A legitimate interest assessment should be considered when relying on legitimate interest as a lawful basis.

Olivia Wint is a seasoned data protection professional, with over five years’ experience in this area. Olivia has worked in a range of sectors including local authority, third sector, start-ups and the Big 4, advising on all aspects of data protection compliance.

An Introduction to English laws tackling revenge pornography – Colette Allen

As the UK moved online in response to the COVID-19 pandemic, reports of image-based abuse – ‘revenge porn’ – doubled. One reason for the increase is that the national lockdown pushed dating lives online, and the sharing of sexual images became one of the few ways to show intimacy. Disclosing, or threatening to disclose, intimate images takes a massive psychological toll on victims, and is therefore an effective means of exerting control. Financial pressure, a surge in domestic violence, and relationship breakdowns have contributed to the rise in reported cases.

Too often, the victim is blamed when their image ends up online. This response disregards the victim’s right to privacy and denies them their sexuality. Most would agree that a person’s consent to have sex with another does not amount to consent to sleep with all of his/her friends – but that is the very logic of those who say individuals ‘should have been more careful’ when their image is disclosed.

If you are a victim of revenge porn, the law can help you regain control and achieve justice.

The uploading of sexual or intimate images online, without the consent of the individual pictured, and with the intention to cause the victim humiliation or embarrassment, is a criminal offence in England and Wales.

The relevant law differs depending on whether or not the victim is over 18 years of age.

Section 33 of the Criminal Justice and Courts Act 2015 (‘CJCA 2015’) applies to adult victims and establishes a maximum sentence of 2 years’ imprisonment following conviction.

For s.33 CJCA 2015 to apply, the image(s) must be private and sexual. Certain parts of the body, like exposed genitals or pubic area, are considered inherently private for the purposes of the offence. Posing in a sexually provocative way will be regarded as private if the image depicts something that would not ordinarily be seen in public.

The victim must show that the reason, or one of the reasons, that their intimate image was shared without their consent was to cause the victim distress (the ‘distress element’). Without proving this, a victim will not be able to secure a conviction against the defendant. The distress element is a distinct part of the trial that will require its own evidence. It is not enough that distress is or would be a natural consequence of the disclosure.

Doctored and computer-generated images, also known as ‘deep fakes’, are not covered by the CJCA 2015. A victim who has had an innocent image transposed onto a pornographic photograph or film does not, unfortunately, have any specific law to draw on. Victims in this scenario should, however, discuss with their solicitor the possibility of securing a conviction under section 1 of the Malicious Communications Act 1988 and/or section 127 of the Communications Act 2003. Victims pursuing this route will still have to find evidence for the distress element in order to secure a conviction, as both s.1 and s.127 require that the message be sent to cause distress or anxiety, or be of a menacing character, respectively.

It is not guaranteed that a victim of revenge porn will be able to secure legal aid funding, but this is something you should ask your solicitor.

If you are a victim of revenge porn, the law can help you regain control and achieve justice.

Defences

It is a complete defence if the defendant reasonably believed that the disclosure was necessary for the investigation, prevention or detection of crime (s.33(3) CJCA 2015), or if the image is disclosed by a journalist who reasonably believes that publication is in the public interest (s.33(4)). A journalist relying on the s.33(4) defence will have to show that there was a legitimate need to publish the photograph or film that goes to the value of a story on an important matter. ‘Public interest’ in this context is not simply something in which the journalist believes the public will be interested.

It is a defence if the defendant believed that the image(s) had previously been made public for financial purposes, i.e. commercial pornography (s.33(5) CJCA 2015). However, a defendant will not be able to rely on s.33(5) if they had reason to believe that the victim had not consented to the prior release.

Anyone who forwards on the image(s) without the victim’s consent is only guilty of a s.33 offence if they do so with the intention to cause the victim distress. Re-sending the image(s) as a joke or for sexual gratification will not amount to an offence merely because distress was a natural consequence of their actions (s.33(8)).

Children 

Possessing, taking, distributing or publishing sexual images of individuals under the age of 18 are offences under section 1 of the Protection of Children Act 1978 and section 160 of the Criminal Justice Act 1988. If you are under the age of 18 and your image has appeared online, the process is much simpler than if you were an adult. There is no need to show a distress element on the part of the defendant.

Parents who have been made aware that their children have shared or have been sent sexual images should be aware that the Crown Prosecution Service Guidelines on revenge pornography make clear that consensual ‘sexting’ between minors of a similar age is not to be treated as an offence. Where there is evidence of grooming, harassment or exploitation, it will be treated as a criminal matter.

Websites

The CJCA 2015 makes it possible for the website operator who hosts the site on which an intimate image was illegally shared to be liable, but only when the operator has actively participated in the disclosure, or failed to remove the material once they have been made aware that it is criminal in nature. In reality, most social media sites will be compliant in removing such material on request.

If any of the matters discussed in this article affect you, visit https://revengepornhelpline.org.uk


Colette Allen has hosted “Newscast” on The Media Law Podcast with Dr Thomas Bennett and Professor Paul Wragg since 2018. She has recently finished the BTC at The Inns of Court College of Advocacy and will be starting an MSc in the Social Sciences of the Internet at the University of Oxford in October 2021.

A look at the European Data Protection Board guidance on supplementary measures – Olivia Wint

Data transfers have been a prominent topic in the data protection world in recent months, with the UK’s recent adequacy decision adding to the conversation on the topic.

On 21 June 2021, the European Data Protection Board (“EDPB”) published the final version of Recommendations on supplementary measures (the “Recommendations”). For context, the first draft Recommendations which were published in November 2020 were prompted as a result of the much-anticipated Schrems II judgment which was handed down in July 2020.

The Schrems II judgment follows the Schrems I judgment, which invalidated the Safe Harbour regime in 2015. The focal point of the Schrems II case concerned the legitimacy of standard contractual clauses (“SCCs”) as a transfer mechanism in respect of cross-border data transfers from the EU to the US. Max Schrems, a privacy advocate, argued that Facebook Ireland’s transfer of a significant amount of data to the US was not adequate due to the US’ surveillance programmes. Schrems argued that this fundamentally affected his right to ‘privacy, data protection and effective judicial protection’. Rather unexpectedly, the Court of Justice of the European Union (“CJEU”) declared the invalidity of the Privacy Shield in this case and, whilst SCCs were not invalidated, the CJEU laid down stricter requirements for cross-border transfers relying on SCCs, which included additional measures to ensure that cross-border transfers have ‘essentially equivalent’ protection to that of the General Data Protection Regulation 2016/679 (“GDPR”).

As a result of the Schrems II judgment and the invalidation of the Privacy Shield, the estimated 5,300 signatories to this mechanism now need to seek alternative transfer mechanisms, and companies on a transatlantic scale have been forced to re-examine their cross-border transfers. As such, the EDPB’s Recommendations could not have come sooner for many in the privacy world.

Based on the Schrems II judgment, supplementary measures are in essence additional safeguards to any of the existing transfer mechanisms cited in Article 46 GDPR, which include SCCs, binding corporate rules (“BCRs”) and approved codes of conduct, to name a few, with the overarching objective of ensuring that the ‘essentially equivalent’ threshold is met.

The EDPB’s Recommendations outline six key steps which comprise part of an assessment when determining the need for supplementary measures:

  1. know your transfers;
  2. identify the transfer mechanism(s) you are relying on;
  3. assess whether the transfer mechanism you are relying on is effective in light of all circumstances of the transfer;
  4. identify and adopt supplementary measures;
  5. take any formal procedural measures; and
  6. re-evaluate at appropriate intervals.

Step 1- know your transfers

Step 1 concerns organisations having a good grasp of their data processing activities, mainly evidenced through data mapping and/or records of processing activities (“ROPAs”). As ROPAs are a direct obligation under the GDPR, in theory for most organisations it will be a case of ensuring that the ROPA accurately reflects any new data processing that has occurred (with the inclusion of any third parties).

Key data protection principles should also be considered for example, lawfulness, fairness and transparency (does the privacy policy make it clear that cross border transfers are taking place?), data minimisation (is the whole data set being transferred or just what is relevant?) and accuracy (have data quality checks been conducted on the data in question?).

The Recommendations stipulate that these activities should be executed before any cross-border transfers are made and highlight the fact that cloud storage access is also deemed to be a transfer.
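
One way to keep Step 1 manageable is to hold each ROPA entry in a structured form so that cross-border transfers surface automatically when reviewed. The sketch below is a minimal, hypothetical illustration – the field names loosely echo the Article 30 record contents but are not prescribed in this form, and the example entry is invented.

```python
from dataclasses import dataclass, field

@dataclass
class RopaEntry:
    # Loosely mirrors the Article 30 record: purpose, data categories,
    # recipients, third-country transfers, retention and security measures.
    processing_activity: str
    purpose: str
    data_categories: list
    recipients: list
    third_country_transfers: list = field(default_factory=list)
    retention: str = ""
    security_measures: list = field(default_factory=list)

entries = [
    RopaEntry(
        processing_activity="HR payroll",
        purpose="Performance of the employment contract",
        data_categories=["contact details", "bank details"],
        recipients=["payroll processor"],
        third_country_transfers=["US (SaaS payroll provider)"],   # invented example
        retention="6 years after employment ends",
        security_measures=["encryption at rest", "role-based access"],
    ),
]

# Step 1: surface every activity that involves a cross-border transfer.
for entry in entries:
    if entry.third_country_transfers:
        print(entry.processing_activity, "->", entry.third_country_transfers)
```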

Step 2- identify the transfer mechanism(s) you are relying on

There are a number of transfer mechanisms that can be relied on for cross-border data transfers, such as SCCs, BCRs, codes of conduct and adequacy decisions, and this step requires organisations to identify the mechanism that will be used for the transfer.

The EDPB has noted that, for organisations relying on an adequacy decision as their transfer mechanism, the subsequent steps in the Recommendations can be disregarded.

N.B. To date, the European Commission has only recognised Andorra, Argentina, Canada (commercial organisations only), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, Switzerland, Uruguay and the UK as providing adequate protection.
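
Step 2 can be partly automated as a simple lookup against the adequacy list quoted above. The sketch below does exactly that, with the caveat that the list changes over time and that scope limitations (for example, the Canada decision covering commercial organisations only) need legal review rather than string matching.

```python
# Snapshot of the adequacy decisions listed above (partial-scope caveats not captured).
ADEQUATE_DESTINATIONS = {
    "Andorra", "Argentina", "Canada", "Faroe Islands", "Guernsey", "Israel",
    "Isle of Man", "Japan", "Jersey", "New Zealand", "Switzerland", "Uruguay",
    "United Kingdom",
}

def needs_further_steps(destination_country: str) -> bool:
    """True if Steps 3-6 of the Recommendations still need to be worked through."""
    return destination_country not in ADEQUATE_DESTINATIONS

for country in ("Japan", "United States"):
    print(country, "-> further assessment needed:", needs_further_steps(country))
```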

Step 3- Assess whether the transfer mechanism you are relying on is effective in light of all circumstances of the transfer

This is a critical part of the assessment and requires organisations to assess/ examine the third country’s legislation and practices to ascertain the extent to which there are limitations which may mean the protection afforded as a result of the cross-border transfer is less than ‘essentially equivalent’. The Recommendations affirm that the scope of the assessment needs to be limited to the legislation and practices relevant to the protection of the specific data you transfer. The legislation and/or practices examined must be publicly available in the first instance, verifiable and reliable.

Key circumstances which may influence the applicable legislation/ and or practices include (but are not limited to):

  • purpose for data transfer (marketing, clinical research etc);
  • sector in which transfer occurs (financial, healthcare etc);
  • categories of personal data transferred (children’s data, health data etc); and
  • format of the data (raw, pseudonymised, anonymised, encrypted at rest and in transit etc).

The assessment should be holistic in nature and cover all relevant parties such as controllers, processors and sub- processors (as identified in Step 1) and should consider the effectiveness of data subject rights in practice.

Examination of the relevant legislation and practices is of utmost importance in situations where:

  (i) legislation in the third country does not formally meet EU standards in respect of rights/freedoms and necessity and proportionality;
  (ii) legislation in the third country may be lacking; and
  (iii) legislation in the third country may be problematic.

The EDPB stipulates that in scenarios (i) and (ii) the transfer in question has to be suspended. There is more flexibility in scenario (iii), where the transfer may be suspended, supplementary measures may be implemented, or the transfer may continue without supplementary measures if you are able to demonstrate and document that the problematic legislation will not have any bearing on the transferred data.

Step 4- Identify and adopt supplementary measures

If, as a result of Step 3, the assessment concludes that the transfer mechanism is not effective in light of the third country’s legislation and/or practices, then the Recommendations urge that consideration be given to whether supplementary measures exist that can ensure an ‘essentially equivalent’ level of protection. Supplementary measures can take a myriad of forms, including technical (controls such as encryption), organisational (procedures) and contractual measures, and must be assessed on a case-by-case basis for the specific transfer mechanism.

N.B. A non-exhaustive list of supplementary measures can be found in Annex 2 of the Recommendations.

Step 5- Take any formal procedural measures

A recurring theme throughout the Recommendations is the need for a nuanced approach to be adopted when assessing each specific transfer mechanism and as such, the procedural measures that will need to be taken are dependent on the specific transfer mechanism with some mechanisms requiring supervisory authority notification.

Step 6- Re-evaluate at appropriate intervals

As with all aspects of compliance, monitoring and re-evaluation of supplementary measures should be done frequently. The Recommendations do not explicitly define a time period; however, factors which could impact the level of protection of the transferred data, such as developments in third country legislation, should trigger re-evaluation.

One of the main aims of the GDPR (and also one of its key principles) is accountability, and the EDPB’s Recommendations on supplementary measures bolster this premise. There is emphasis placed on documentation which adequately considers and records the decision-making process at each of the six steps, to ensure organisations have an accurate audit trail.

In addition to the EDPB’s Recommendations, it is important for organisations (especially global ones) to take heed of any local developments in this area. With the CNIL already publishing guidance, the ICO expected to issue guidance, and the Bavarian Data Protection Authority’s ruling against Mailchimp in this area, it can be said that supplementary measures will be the crux of many impending data protection developments.

Olivia Wint is a seasoned data protection professional, with over five years’ experience in this area. Olivia has worked in a range of sectors including local authority, third sector, start-ups and the Big 4, advising on all aspects of data protection compliance.