The Online Safety Bill: Everything in Moderation? – Naomi Kilcoyne – Parts III, IV and V

PART III: CRITICISM

In a rare show of national unity, disapproval of the OSB has spanned both ends of the political spectrum. Alongside criticism from Labour's shadow culture minister, Conservative politicians have also weighed in on the ‘legal but harmful’ debate. Thinktanks and non-profit groups have likewise been apprehensive.

Perhaps most headline-grabbing was the censure of the former Supreme Court judge, Lord Sumption, who denounced the OSB in an article in The Spectator, and subsequently on the Law Pod UK podcast.

From this range of commentary, five common criticisms may be collated:

(1) The OSB’s inclusion of merely ‘harmful’ material is overly broad and worryingly vague.

This is the central target of Lord Sumption’s criticism. While illegal content is obviously unacceptable and capable of precise definition, he argues, ‘legal but harmful’ content establishes a vague and subjective test capable of near limitless range. Moreover, the restriction of entirely legal speech, at least as regards adults, is difficult to justify as anything other than state paternalism.

Others have pointed out the discrepancy between offline duties of care (focused on the risk of physical injury), and the broader, amorphous online duty of care envisaged by the OSB. As one legal commentator put it, ‘speech is not a tripping hazard’.

Notwithstanding the Government’s revisions to the draft Bill, it is difficult to dispute these arguments. The Bill’s simplified definition of ‘harmful’ only displaces the definitional issue: what constitutes a ‘material’ risk, or ‘significant’ harm? How many potential victims is an ‘appreciable’ number? The lack of a clear and stable legal basis for interference with lawful speech raises obvious rule-of-law concerns.

(2) The designation of ‘harmful’ content in secondary legislation hands disproportionate control to the executive.

In extending the affirmative resolution procedure (ARP) to the regulations made by the Secretary of State, the Government explicitly acknowledged the need to ensure that Parliament can scrutinise the designation of future harms. The Response also justifies the use of secondary legislation on the basis that the list of harms can be updated as technology continues to advance (at [12], [55]-[56]).

With the rise of deepfake technology, and emergence of ‘the metaverse’, the need for flexibility in the face of emerging online harms is apparent. Equally, executive efforts to maximise legislative scrutiny are to be welcomed, particularly given concerns over the increased use of secondary legislation following Brexit and the Covid-19 pandemic.

However, whether the ARP secures meaningful parliamentary oversight is debatable. A recent House of Lords Committee Report noted that, even under the ARP, Parliament is unable to amend delegated legislation. This creates an ‘all or nothing’ situation in which Parliament is highly unlikely to reject the instrument in its entirety (at [27]-[29]). If parliamentary scrutiny of the regulations is thus neutered, Lord Sumption may be correct that ‘these are powers which no public officer ought to have’.

(3) The discretion afforded to the regulator, Ofcom, threatens a lack of democratic procedure for content moderation.

A related concern surrounds Ofcom, to whom oversight and enforcement of the regulatory framework are delegated. In this sense, the Bill presents a double discretionary risk: from both executive and regulator.

The Government’s Response reiterated that the scope of Ofcom’s powers is clearly stated in the Bill; furthermore, Ofcom will publish detailed enforcement guidelines (at [81]-[83]). In July 2022, Ofcom duly released its ‘roadmap to regulation’, confirming its commitment to uphold freedom of expression online.

That said, since the OSB’s inception, the independence of the regulator has been called into question. The Government’s efforts to secure the appointment of former Daily Mail editor Paul Dacre as chair of Ofcom – despite Dacre having initially been rejected as inappropriate for the role – have drawn accusations of executive favouritism. Although Dacre has now withdrawn from the running, the Scottish and Welsh devolved legislatures signed a joint letter citing ‘profound concerns’ regarding the ‘perceived lack of impartiality and transparency of the current appointment processes at Ofcom’.

Governmental discretion to define what constitutes ‘acceptable speech’ is thus compounded by the possibility that such a government might also seek to influence the independent regulator.

(4) Service providers’ obligation to fulfil their statutory duties risks the over-removal of legal content.

The risk that the OSB may exercise a ‘chilling effect’ operates on two levels: deliberate and accidental.

The former is best summarised by Lord Sumption, writing in The Spectator: ‘if in doubt, cut it out’. There is concern that, faced with criminal sanctions or crippling fines (up to £18 million, or 10% of global revenue), service providers will simply take the path of least resistance and remove content which, although legal, may be harmful. Given the OSB’s broad definition of ‘harm’, and underwhelming provisions for user redress, companies may well feel that the risks of non-compliance outweigh those of over-moderation.

This concern is amplified by the threat of accidental over-removal, as a result of internet algorithms. As the non-profit Electronic Frontier Foundation explains, algorithms are even worse than human moderators at differentiating between online vitriol and online debate: context, tone and irony are all left by the wayside. By using a digital sledgehammer to crack a human nut, online platforms may inadvertently remove content whose nuances cannot be detected.

Moreover, as Lord Sumption notes in his article, the OSB’s instruction to ‘have regard to’ the importance of freedom of expression – even if heeded by online platforms anxious to avoid liability – will hardly register with an internet algorithm programmed to be over-cautious.

(5) The overall impact of the OSB threatens to restrict free speech.

Lord Sumption writes that the UK’s traditional hostility to state interference with free speech stems not only from an attachment to individual liberty, but also from the awareness that such interference inhibits growth. As Lord Sumption argues, the confrontation of opposing views is essential to the development of knowledge: the alternative is that only ‘the anodyne, the uncontroversial, the conventional and the officially approved’ will be allowed past the filter.

This vision of the future echoes that foreseen by the Conservative backbencher David Davis, for whom the OSB risks producing the ‘biggest accidental curtailment of free speech in modern history’. Davis denounces the OSB as no more than a ‘Censor’s Charter’. In July 2022, meanwhile, the Conservative leadership candidate Kemi Badenoch accused the OSB of ‘legislating for hurt feelings’, declaring her intention to ‘ensure the bill doesn’t overreach’.

However, other commentators reject the supposed mutual exclusivity of content moderation and dialogue, arguing that the former is essential to facilitate the latter. Without regulation to prevent online abuse, they contend, the ‘free speech absolutism’ endorsed by key players like Musk can in practice restrict others’ freedom of speech. Provided it is done correctly, ‘content moderation is not the same as censorship’.

However, ‘correct’ is surely the operative word. Criticisms (1) to (4) offer compelling evidence that the revised OSB, despite its protestations to the contrary, ultimately fails to strike the right balance.

PART IV: CONSEQUENCES

It is difficult to deny that, as a legislative proposal, the OSB raises serious rule-of-law concerns.

Most, if not all, conceptions of the rule of law prioritise clear, stable and prospective laws, by which citizens may plan their conduct. In this way, the rule of law serves to protect the individual against arbitrary interference by the state.

As one of its most prolific critics puts it, the OSB thus falls down in both respects: impermissible vagueness, and excessive discretion.

Before concluding, several other rule-of-law shortcomings of the OSB invite brief comment:

One concern follows from the DSA, which (as described above) adopts a different regulatory framework to the OSB. The rule of law demands that citizens should know by which laws they are governed, so they may act accordingly. The potential for jurisdictional confusion is thus alarming. Given the borderless nature of the Internet, it is not difficult to envisage issues arising over the applicable regulatory scheme – e.g. where a user resides in the UK, but has used a VPN to register their location elsewhere (‘geo-spoofing’).

A second concern is the OSB’s mechanisms for user redress. Access to justice has been recognised as a fundamental component of the rule of law. Under the revised Bill, providers must make clear that users have a right of action for breach of contract, if their content is removed contrary to that provider’s terms of service (clause 19(4)). However, whether services will be obliged to notify users that their content has been removed is left to Ofcom’s discretion – indeed, the Government’s Response suggests such a duty ‘will not necessarily be proportionate in all cases’ (at [99]). But what good is a right of action, if a user is not notified that the factual prerequisite for such an action has occurred?

Finally, we might consider whether the OSB is truly ‘future-proofed’. We live in an era where democratic backsliding is not only prevalent globally, but possibly present in the UK. It is thus essential to interrogate the adequacy of legislation in the face of prospective future governments which may not respect the rule of law or constitutional conventions – for example, by openly endorsing partisan candidates to chair an independent regulator…

PART V: CONCLUSION

The above analysis indicates that the revised OSB overshoots. Its declared scope exposes too broad a swathe of online content to delegated and discretionary regulation – such that, troublingly, nearly everything online is ‘in’ (or under) moderation.

The Government should perhaps return to the original aphorism: everything in moderation. Targeting unlawful speech online is without question justified, particularly in today’s polarised and post-truth digital society. Nevertheless, the impulse to moderate must be tempered by fundamental human rights and rule of law considerations.

Attempting too much risks achieving too much – and producing a thoroughly immoderate result.

An appended Part VI details the most recent updates to the OSB. Parts I (Content) and II (Context) are published separately.

Naomi Kilcoyne is a Visiting Lecturer in Public Law at City University, having completed her GDL there in 2021-22. She has a particular interest in the interplay between public and private law.
