Privacy Law in Practice – An Insight into Data Protection Law as an In-House IT Lawyer – Madeleine Weber

Welcome to Privacy Law in Practice, our series at TPP demystifying what it is like to practice in privacy law.

Have you ever wondered which data protection law issues come up in practice? It obviously depends on the industry and area you work in, but data protection law might be more prevalent than you think.

Continue reading

The Online Safety Bill: Everything in Moderation? – Naomi Kilcoyne – Part VI, Updates to the Bill

PART VI: UPDATES

Any commentary upon legislation in progress risks rapidly becoming outdated: an occupational hazard to which this piece is by no means immune.

Ahead of the OSB’s return to Parliament, the Government issued a press release on 28 November 2022 noting a number of important developments to the amended Bill.

Continue reading

The Online Safety Bill: Everything in Moderation? – Naomi Kilcoyne – Parts III, IV and V

PART III: CRITICISM

In a rare show of national unity, disapproval of the OSB has spanned both ends of the political spectrum. Alongside criticism from the Labour culture minister, Conservative politicians have also weighed in on the ‘legal but harmful’ debate. Thinktanks and non-profit groups have likewise been apprehensive.

Perhaps most headline-grabbing was the censure of the former Supreme Court judge, Lord Sumption, who denounced the OSB in an article in The Spectator, and subsequently on the Law Pod UK podcast.

Continue reading

The Online Safety Bill: Everything in Moderation? – Naomi Kilcoyne – Parts I and II

A number of Bills proposed by the recent Conservative governments have sparked controversy among commentators: among them, the Northern Ireland Protocol Bill, the Retained EU Law Bill, and the ‘British’ Bill of Rights Bill. Taking its place in the rogues’ gallery is the Online Safety Bill (OSB).

Now returning to the House of Commons on 5 December 2022 to finish its Report Stage, the OSB has come some way since the ‘Online Harms’ White Paper published in April 2019. The Bill raises important questions about freedom of expression, online speech regulation and government (over)reach.

This article has four principal components.

Part I lays out the content and objectives of the Bill, highlighting its legislative development and the key issues arising from that. Part II situates the Bill within the wider context of online regulation, considering how recent developments may inflect the Bill’s impact.

This provides the framework for Part III, which addresses the various criticisms that the Bill has received from commentators across the political spectrum. Part IV then examines the broader legal and theoretical consequences of the Bill, posing further questions to be answered. Some conclusions are drawn in Part V.

An appended Part VI briefly outlines the most recent updates to the Bill.

PART I: CONTENT

Much of the OSB’s content was clarified by the Commons Digital, Culture, Media and Sport (DCMS) Committee Report in January 2022, and the Government’s Response to this in March 2022.

As these reports confirmed, the main priority of the OSB is evident from its name change. Now couched in broader terms, the Bill is designed to protect internet users’ online safety by way of three central objectives (Response, at [2]):

  1. To tackle illegal content and activity.
  2. To deliver protection for children online.
  3. To afford adults greater control, while protecting freedom of expression.

To achieve these objectives, the Bill operates on a duty of care model. Under this model, online platforms are liable only for their own conduct: the Bill seeks to hold platforms responsible for systemic ‘lack of diligence in failing to adopt preventive or remedial measures’ (Report, at [7]). This is, in theory, a less stringent regulatory model than ‘intermediary liability’, under which online platforms would also be liable for others’ content and activity.

Moreover, service providers will not owe a limitless duty of care (Report, at [4]). Instead, the Bill divides providers into various categories, which in turn are subject to specific duties. For example, Category 1 (high-risk and high-reach, user-to-user) services are deemed to be the largest and most risky, so incur additional duties as compared to Categories 2A (all regulated search services) and 2B (the remaining regulated user-to-user services).

Enforcement of such duties lies not with the government, but with the regulatory authority Ofcom, to which the legislation grants oversight and enforcement powers (Response, at [3]).

Central to the Bill’s duty of care model is its typology of online content. Initially, the OSB distinguished illegal from legal material, the latter of which it subdivided into two – producing three content typologies to align with the Bill’s stated objectives:

  1. Illegal content
  2. Legal but harmful content
    • Content that is harmful to children
    • Content that is harmful to adults (for Category 1 services)

The Bill originally defined each type of content as follows (Report, at [5]):

  • Illegal content: content whose use / dissemination constitutes a relevant offence
  • Content harmful to children and adults:
    • Designated – content of a type designated in regulations made by the Secretary of State
    • Non-designated – content which fulfils one of the general definitions
      • These apply where the provider has reasonable grounds to believe that there is a material risk of the content having (even indirectly) a significant adverse physical / psychological impact on a child or adult (as applicable) of ordinary sensibilities.

These definitions were essential to the Bill’s regulatory framework, since they directly underpinned the associated risk assessment and safety duties (Report, at [6]). Simply put, how content is defined determines what a provider is required (or not) to do about it. The lower the definitional bar, the more content is subject to regulation – and, potentially, removal.

While illegal content has certainly provoked discussion, controversy has principally surrounded the ‘legal but harmful’ debate. The regulation of such content raises the question: can moderation be justified where the content, by its nature, does not meet the criminal standard?

Of particular interest are the Government’s subsequent amendments to the draft Bill, following the DCMS Report. Despite accepting eight of the Committee’s recommendations, the Government’s Response stated in the legal but harmful context that ‘rather than using the Committee’s proposed reframing, we have made other changes that meet a similar objective’ (at [29]).  

As the Bill stood in March 2022, the Government had amended its position in the following key areas:

  1. Definition of ‘harmful’ – This was simplified under the revised Bill: content had to present a material risk of significant harm to an appreciable number of children/adults (Response, at [30]). The key threshold to engage safety duties was one of ‘priority’ harmful content.
  2. Designation of types of harmful content – As envisaged in the draft Bill, priority content harmful to children and adults was to be designated by the Secretary of State in secondary legislation, following consultation with Ofcom. This would now be subject to the affirmative resolution procedure, to maximise parliamentary scrutiny (Response, at [12], [55]-[57]). The government also published an indicative list of what might be designated under the Bill as priority harmful content.
  3. Non-priority content harmful to adults – The revised Bill removed the obligation upon service providers to address non-priority content harmful to adults. Companies were required only to report its presence to Ofcom (Response, at [31], [40]).

According to a Ministerial Statement released in July 2022, service providers’ safety duties regarding ‘legal but harmful’ content could thus be broken down as follows:

  1. Children – Primary priority content harmful to children
    • Services must prevent children from encountering this type of content altogether
  2. Children – Priority content harmful to children
    • Services must ensure content is age-appropriate for their child users
  3. Adults – Priority content harmful to adults
    • Applies only to Category 1 services
    • These must address such content in their terms and conditions, but may set their own tolerance: this may range from removing such content, to allowing it freely.

PART II: CONTEXT

To understand the ‘legal but harmful’ debate more fully, we must situate the OSB in context.

Europe:

In the EU, the recently adopted Digital Services Act (DSA) shares some similarities with the OSB: both provide a legal framework for online platforms’ duties regarding content moderation.

However, Dr Monica Horten has identified the following distinctions:

  • The DSA focuses on regulating illegal rather than merely ‘harmful’ content. In doing so, according to the non-profit Electronic Frontier Foundation, the DSA ‘avoids transforming social networks and other services into censorship tools’ – a position from which the OSB’s broader scope deviates.
  • The DSA unequivocally recognises the right to freedom of expression as guaranteed by Article 11 of the Charter of Fundamental Rights, in accordance with which service providers must act when fulfilling their obligations. The adequacy of free speech protection under the OSB may be less assured, as considered below.
  • The measures also differ in their provision of redress. While the DSA includes both prospective and retrospective procedural safeguards for users who have acted lawfully, the OSB arguably falls short – despite the Government’s assurance that users’ access to courts would not be impeded by the Bill’s ‘super-complaints mechanism’ (Response, at [18]).

It is also worth noting the proposed European Media Freedom Act (EMFA), broadly hailed as a positive step for journalistic pluralism within the EU. Granted, the OSB purports to exclude the press (‘news publishers’) from its content moderation rules. However, uncertainty remains as to the possible regulation of comments sections on newspaper websites, not to mention newspapers’ own activity on social media.

USA:

Across the Atlantic, the US courts show some signs of a legal vacuum developing around over-moderation. Recent attempts by social media users to challenge online content moderation by asserting their First Amendment rights have failed, on the basis that sites such as Facebook and Twitter are not ‘state actors’, but rather private actors not subject to constitutional claims.

As a counterpoint, the recent takeover of Twitter by Elon Musk may illustrate the risks of under-moderation. Concerns are particularly acute in light of Musk’s reinstatement of banned high-profile accounts – having stated he would wait until a new ‘content moderation council’ had convened – and his announcement of a general amnesty. This follows the removal of thousands of Twitter content moderators, and swift resurgence of hate speech and misinformation.

UK:

Returning to the UK, the wider position of freedom of expression is somewhat ambiguous.

On the one hand, the aforementioned Bill of Rights Bill (BORB) claims to improve safeguards: clause 4 requires judges to give ‘great weight’ to protecting freedom of expression. However, the former Deputy President of the Supreme Court, Lord Mance, has queried how different this is to the ‘particular regard’ provision in s 12(4) of the HRA. Other commentators have questioned whether this presumptive priority of Article 10 may in fact skew the balance in privacy actions, which rely on the presumptive parity between Articles 8 and 10. On either analysis, the BORB’s parallel statutory attempt to enshrine freedom of expression – recalling the OSB’s third objective – is not encouraging.

On the other hand, calls for greater online regulation have gained traction following the inquest into the death of the British teenager Molly Russell. The senior coroner found in October that the 14-year-old had suffered from ‘the negative effects of on-line content’, calling inter alia for ‘the effective regulation of harmful on-line content’, and for legislation ‘to ensure the protection of children’ against its effects. This offers a compelling policy argument in favour of the OSB’s second objective.

This overview of the Bill’s content and context provides the factual basis for a normative analysis of its criticisms and consequences in Parts III and IV.

Naomi Kilcoyne is a Visiting Lecturer in Public Law at City University, having completed her GDL there in 2021-22. She has a particular interest in the interplay between public and private law.

Attorney General v BBC [2022] EWHC 1189 (QB) – High Court considers what information can be made public about alleged MI5 CHIS

In a judgment handed down on 18 May 2022, the High Court considered what information the BBC can publish in a story pertaining to the actions of an alleged MI5 covert human intelligence source (“CHIS”).

The BBC alleged that X was a CHIS and had been psychologically and sexually abusive to two female partners.

The judgment can be found here: https://www.bailii.org/ew/cases/EWHC/QB/2022/1189.html

The judgment is in two parts – one heard in public and the other in private. The private hearing was held to be necessary so that the Court could hear submissions about information that, if released to the public, would make the identity of the alleged CHIS known.

Mr Justice Chamberlain commented: “The court must be alert to the possibility of “jigsaw” identification. One piece of information may on its own seem innocuous, but when taken together with other information known to a particular malign actor, it may lead to the identification of an individual with greater or lesser confidence. The threat of jigsaw identification is a familiar feature of arguments against disclosure in closed material proceedings in the national security context. It is regularly deployed as a basis for refusing to disclose information known only from covert sources. But, although the court must be alive to the threat of jigsaw identification, it must also be astute not to allow the threat to justify a blanket prohibition on disclosure of any piece of the jigsaw.”

at [24]

The BBC’s article on the case can be found here: https://www.bbc.co.uk/news/uk-61528286

The initial BBC coverage of this matter is here: https://www.bbc.co.uk/news/uk-61508520

And details of the legal action one of X’s former partners is taking against MI5 here: https://www.bbc.co.uk/news/uk-politics-61521569

Citation: The Guardian: Privacy laws could be rolled back, government sources suggest – A rebuttal

The Guardian has a piece suggesting, following the judgment of the UK Supreme Court this week in ZXC, that privacy laws could be rolled back by replacements to the Human Rights Act.

Following the judgment in ZXC a government spokesperson has stated: “A free press is one of the cornerstones of any democracy. The government recognises the vital role the media plays in holding people to account and shining a light on the issues which matter most. We will study the implications of the judgment carefully.”

Whilst political sources are usually careful not to criticise judges, the balance between freedom of expression and privacy rights of individuals is a contentious area, drawing critical voices from both sides of the debate. TPP advocates balance between the two competing rights.

Continue reading

Bloomberg v ZXC: UK Supreme Court finds that suspects of crime have a reasonable expectation of privacy in investigation details pre-charge

Judgment has been handed down by the UK Supreme Court in the appeal in the case of Bloomberg v ZXC. The court has found for the respondent, dismissing the appeal.

The case has significant implications for the law of privacy. It endorses the finding in the Cliff Richard case and provides crucial precedent on the reasonable expectation of privacy that suspects of crime enjoy. TPP will have further coverage of the judgment shortly. See the judgment here.

“The courts below were correct to hold that, as a legitimate starting point, a person under criminal investigation has, prior to being charged, a reasonable expectation of privacy in respect of information relating to that investigation and that in all the circumstances this is a case in which that applies and there is such an expectation.”

at [146]

Top 10 Defamation Cases 2021: a selection – Suneet Sharma

Inforrm reported on a large number of defamation cases from around the world in 2021. Following my widely read posts on 2017, 2018, 2019 and 2020 defamation cases, this is my personal selection of the most legally and factually interesting cases from England, Australia and Canada from the past year.

Please add, by way of comments, cases from other jurisdictions which you think should be added.

  1. Fairfax Media Publications Pty Ltd; Nationwide News Pty Limited; Australian News Channel Pty Ltd v Voller [2021] HCA 27

The controversial finding of the majority of the High Court of Australia that news organisations were publishers of third-party comments on their Facebook pages.

Mr Voller brought defamation proceedings against a series of media organisations, alleging that each of the applicants became a publisher of any third-party comment on its Facebook page once it was posted and read by another user. He was successful at first instance, and the successive appeals against the finding were rejected. The position was summarised as follows:

“each appellant intentionally took a platform provided by another entity, Facebook, created and administered a public Facebook page, and posted content on that page. The creation of the public Facebook page, and the posting of content on that page, encouraged and facilitated publication of comments from third parties. The appellants were thereby publishers of the third-party comments” [105].

Inforrm had a post about the decision.

The Australian Government are already proposing to reverse the effect of this decision by statute – see the Inforrm post here.

  2. Lachaux v Independent Print Limited [2021] EWHC 1797 (QB)

In the latest instalment in the long running saga of the Lachaux libel litigation, Mr Justice Nicklin dismissed the Defendants’ public interest defence and ordered the publishers of The Independent, The i and the Evening Standard newspapers to pay £120,000 in libel damages to aerospace engineer Bruno Lachaux. The defendants falsely alleged he had, amongst other things, been violent, abusive and controlling towards his ex-wife, that he had callously and without justification taken their son away from her, and that he had falsely accused his ex-wife of abducting their son.

The Judge provided important commentary on the standards to be upheld by defendants seeking to establish the public interest defence to what would otherwise be considered defamatory coverage.  He said:

I have no hesitation in finding that it was not in the public interest to publish [Articles], which contained allegations that were seriously defamatory of the Claimant, without having given him an opportunity to respond to them. The decision not to contact the Claimant was not a result of any careful editorial consideration, it was a mistake …journalists and those in professional publishing organisations should be able to demonstrate, not only that they reasonably believed the publication would be in the public interest, but also how and with whom this was established at the time…

Inforrm had a case comment, as did 5RB.

The saga has not yet concluded.  The defendants have been granted permission to appeal and their appeal will be heard by the Court of Appeal on 12 April 2022.

3. Hijazi v Yaxley-Lennon [2021] EWHC 2008 (QB)

A case concerning a short altercation between two pupils on the playing field of Almondbury Community School in Huddersfield. A video was taken of the incident which subsequently “went viral”, just after the perpetrator of the altercation was expelled from school. He later received a caution for common assault for the incident.

On 28 and 29 November 2018 Mr Yaxley-Lennon used his Facebook account to post two videos of himself giving his opinion on the incident. He suggested, contrary to narratives emerging from media coverage of the altercation, that some of the sympathy toward Mr Hijazi (the claimant) was undeserved, as he had committed similar violence.

Both videos were found to be defamatory of Mr Hijazi.

In finding for the claimant after the substantive trial, Mr Justice Nicklin stated:

“The Defendant’s allegations against the Claimant were very serious and were published widely. The Defendant has admitted that their publication has caused serious harm to the Claimant’s reputation. The consequences to the Claimant have been particularly severe. Although it was media attention on the Viral Video that first propelled the Claimant (and Bailey McLaren) into the glare of publicity, overwhelmingly that coverage (rightly) portrayed the Claimant as the victim in the Playing Field Incident. The Defendant’s contribution to this media frenzy was a deliberate effort to portray the Claimant as being, far from an innocent victim, but in fact a violent aggressor. Worse, the language used in the First and Second Videos was calculated to inflame the situation. As was entirely predictable, the Claimant then became the target of abuse which ultimately led to him and his family having to leave their home, and the Claimant to have to abandon his education. The Defendant is responsible for this harm, some of the scars of which, particularly the impact on the Claimant’s education, are likely last for many years, if not a lifetime.”

There was an Inforrm case comment.

4. Abramovich v HarperCollins Publishers Ltd & Anor [2021] EWHC 3154 (QB)

Chelsea FC owner Roman Abramovich succeeded at a preliminary issue trial on meaning. Mrs Justice Tipples found that all nine of the meanings of allegations relating to Abramovich’s purchase of Chelsea FC “on the directions of President Putin and the Kremlin” were defamatory.

The case concerned a defamation claim against Catherine Belton and publisher HarperCollins over allegations made in her book, Putin’s People: How the KGB Took Back Russia and Then Took On The West.

5. Vardy v Rooney [2021] EWHC 1888 (QB) – Inforrm Case Comment

Known as the “Wagatha Christie” litigation, this concerned a claim of defamation brought by Rebekah Vardy against Coleen Rooney. The case stems from a series of statements published by the defendant on her public Instagram account. Mr Justice Warby previously found that the statements meant:

Over a period of years Ms Vardy had regularly and frequently abused her status as a trusted follower of Ms Rooney’s personal Instagram account by secretly informing The Sun newspaper of Ms Rooney’s private posts and stories, thereby making public without Ms Rooney’s permission a great deal of information about Ms Rooney, her friends and family which she did not want made public.

This part of the litigation concerns the claimant’s attempts to strike out parts of the defence and to obtain summary judgment. A number of paragraphs of the Amended Defence were struck out in relation to allegations of the claimant’s publicity-seeking behaviour.

  6. Nettle v Cruse [2021] FCA 93

Sydney-based plastic surgeon Dr Nettle refused to operate on Ms Cruse. Throughout 2018, Cruse posted comments which were highly defamatory of Dr Nettle, including by creating a website at a URL incorporating Dr Nettle’s name. Allegations ranged from failing to keep records confidential to performing unauthorised surgeries. The court found in Dr Nettle’s favour, concluding:

“Dr Nettle has proved that he was defamed by Ms Cruse in four publications in 2018.  Judgment will be entered for Dr Nettle with damages payable by Ms Cruse assessed at $450,000.  Injunctions restraining Ms Cruse from republishing the four impugned publications, or the imputations which have been found to be conveyed by them, will be made permanent.  Ms Cruse will also be ordered to pay Dr Nettle’s costs of the proceeding.”             

  7. Webb v Jones [2021] EWHC 1618 (QB)

A libel claim arising from Facebook postings. The claimant failed to comply with the pre-action protocol and failed to provide particulars of publication context in her pleading until three months after service of the Claim Form.  The defendant’s application for strike out in this case was successful.  The case provides useful guidance on the procedural niceties of conducting a libel claim. Inforrm has a case comment. 

  8. Corbyn v Millett [2021] EWCA Civ 567

The respondent issued defamation proceedings against Jeremy Corbyn in respect of an interview he gave on the Andrew Marr Show, in which he had referred to people in the audience as “Zionists” who “don’t understand English irony”. Saini J held that this made a defamatory allegation of fact. Mr Corbyn appealed. Warby LJ held that the judge did not err in finding that the words ‘disruptive’ and ‘abusive’ were statements of fact: the appellant was “presenting viewers with a factual narrative”. He also held that the Judge’s approach to ‘bare comment’ had been correct, and that there was no error of law in the finding that the imputations were defamatory at common law.

  9. Greenstein v Campaign Against Antisemitism [2021] EWCA Civ 1006

A libel claim against the Campaign Against Antisemitism after the Campaign referred to Greenstein in a series of five articles published on its website. The appeal was against an order striking out particulars of malice and judgment entered in favour of the Campaign. In upholding the first instance decision, Dingemans LJ reiterated the principles for finding malice from Horrocks v Lowe [1975] AC 135.

  10. Chak v Levant 2021 ABQB 946

Rebel Media founder Ezra Levant was ordered to pay damages of $60,000 after Leonard J found that he had defamed a political science professor and former Liberal candidate during a 2014 Sun News broadcast. Levant claimed Farhan Chak “shot up” a nightclub when he was 19 years old.

Quotes from caselaw 6: HRH The Duchess of Sussex v Associated Newspapers Ltd [2021] EWCA Civ 1810 – Meghan Markle successful in defending appeal by Mail on Sunday

This was an appeal against the grant of summary judgment on the Duchess’s misuse of private information and copyright claims.

The appellant was granted permission to appeal on seven grounds:

i) The new evidence issue: Whether the new evidence provided by each of the parties should be admitted.

ii) The nature of the attack issue: Whether the judge mistakenly failed to recognise the significance and importance of the People Article’s attack on Mr Markle.

iii) The reasonable expectation of privacy issue: Whether the judge adopted a flawed analysis of the factors undermining the Duchess’s alleged reasonable expectation of privacy.

iv) The appropriate test issue: Whether the judge wrongly stated the test, by suggesting that the defendant had to justify an interference with the claimant’s right of privacy, when the proper approach was to balance the competing article 8 and 10 rights.

v) The right of reply issue: Whether the judge wrongly applied a strict test of necessity and proportionality to Mr Markle’s right of reply to the People Article.

vi) The public interest/article 10 copyright issue: Whether the judge failed properly to evaluate the interference with article 10, saying that it would be a rare case in which freedom of expression would outweigh copyright.

vii) The fair dealing copyright issue: Whether the judge wrongly relied on his privacy analysis to reject the fair dealing defence to breach of copyright, bearing in mind the limited scope of the copyright in the Letter and the wide scope of the concept of reporting current events.

Sir Geoffrey Vos, giving the unanimous judgment of the court, decided against the appellant on all grounds and dismissed the appeal, stating summarily:

Essentially, whilst it might have been proportionate to disclose and publish a very small part of the Letter to rebut inaccuracies in the People Article, it was not necessary to deploy half the contents of the Letter as Associated Newspapers did. As the Articles themselves demonstrate, and as the judge found, the primary purpose of the Articles was not to publish Mr Markle’s responses to the inaccurate allegations against him in the People Article. The true purpose of the publication was, as the first 4 lines of the Articles said: to reveal for the first time [to the world] the “[t]he full content of a sensational letter written by [the Duchess] to her estranged father shortly after her wedding”. The contents of the Letter were private when it was written and when it was published, even if the claimant, it now appears, realised that her father might leak its contents to the media.

at [106]

Quotes from caselaw 5: Lloyd v Google LLC [2021] UKSC 50 – no one-size-fits-all claim available in data protection “Safari Workaround” class action

In one of the most significant privacy law judgments of the year, the UK Supreme Court considered whether a class action could be brought against Google for breach of its obligations as a data controller under s 4(4) of the Data Protection Act 1998 (“DPA”) in respect of its application of the “Safari Workaround”. The claim for compensation was made under s 13 DPA 1998.

The amount claimed per person in the letter of claim was £750. Multiplied across the number of people impacted by the processing, the potential liability of Google was estimated to exceed £3bn.

“The claim alleges that, for several months in late 2011 and early 2012, Google secretly tracked the internet activity of millions of Apple iPhone users and used the data collected in this way for commercial purposes without the users’ knowledge or consent.”

Lord Leggatt at [1]

The class action claim was brought under rule 19.6 of the Civil Procedure Rules.

Lord Leggatt handed down the unanimous judgment in favour of the appellant Google LLC:

“the claim has no real prospect of success. That in turn is because, in the way the claim has been framed in order to try to bring it as a representative action, the claimant seeks damages under section 13 of the DPA 1998 for each individual member of the represented class without attempting to show that any wrongful use was made by Google of personal data relating to that individual or that the individual suffered any material damage or distress as a result of a breach of the requirements of the Act by Google.”

at [159]

It should be noted that the claim was brought under the Data Protection Act 1998 and not under the GDPR.

See the full judgment here. The Panopticon Blog has an excellent summary.