ICO launches consultation on the Draft Journalism Code of Practice

The ICO’s consultation on its Draft Journalism Code of Practice has begun.

Be sure to have your say: the deadline to submit responses is 22 January 2022.

The Code covers privacy safeguards among many other topics. In particular, it addresses the journalism exemption under the Data Protection Act 2018, a broad exemption that disapplies certain data protection requirements relating to the holding and processing of personal data.

Journalism should be balanced with other rights that are also
fundamentally important to democracy, such as data protection and the
right to privacy.

at p.4

The Code substantively addresses the safeguarding of journalism under the exemption, briefly touching on balancing a free press against privacy rights before going on to discuss how this balance is struck under data protection laws:

Why is it important to balance journalism and privacy?

It is widely accepted that a free press, especially a diverse press, is a
fundamental component of a democracy. It is associated with strong and
important public benefits worthy of special protection. This in itself is a public
interest.

Most obviously, a free press plays a vital role in the free flow of
communications in a democracy. It increases knowledge, informs debates
and helps citizens to participate more fully in society. All forms of journalistic
content can perform this crucial role, from day-to-day stories about local
events to celebrity gossip to major public interest investigations.

A free press is also regarded as a public watch-dog. It acts as an important
check on political and other forms of power, and in particular abuses of
power. In this way, it helps citizens to hold the powerful to account.

However, the right to freedom of expression and information should be
balanced with other rights that are necessary in a democratic society, such
as the right to privacy. The public interest in individual freedom of expression
is itself an aspect of a broader public interest in the autonomy, integrity and
dignity of individuals.

The influence and power of the press in society, and the reach of the
internet, means that it is particularly important to balance journalism and
people’s right to privacy.

This code provides guidance about balancing these two important rights by
helping you to understand what data protection law requires and how to
comply with these requirements effectively.

at p.25

ICO intervenes in nine schools in North Ayrshire which are using facial recognition software to scan faces of pupils in lunch queues

According to the Financial Times and the Guardian, the ICO is set to intervene in nine schools in North Ayrshire following the discovery that pupils' faces were being scanned in lunch queues to take payments.

The ICO commented: 

“Data protection law provides additional protections for children, and organisations need to carefully consider the necessity and proportionality of collecting biometric data before they do so. Organisations should consider using a different approach if the same goal can be achieved in a less intrusive manner. We are aware of the introduction, and will be making inquiries with North Ayrshire council.”

Whilst the company that provides the software argues this is a safe way to take payments in the age of Covid-19, the question clearly arises, as the ICO rightly posits, as to whether a less invasive method of safely taking payments could be used.

Simple measures, such as issuing pupils with lunch cards that they can scan to identify themselves, or even just a unique ID number that could easily be anonymised and aggregated, would serve this purpose just as well.

Under Article 35 of the GDPR, a Data Protection Impact Assessment must be carried out before this software is used. This would assess whether the use of facial recognition software is a proportionate means of achieving the legitimate aim of securely taking payments. Aspects such as the retention period of the data, storage methods, the lawful basis for processing, safeguards and processes for gathering consent must be considered.

Schools should have mechanisms and documentation in place to explain to children the circumstances of this data collection and storage, and their rights under the GDPR, including an option to opt out of the data collection.

Under the GDPR, the age at which children in England and Wales can consent to the sharing of their personal data is as low as is permissible: thirteen. In Scotland, where the schools are located, the age is lower still, at twelve years of age.

Interestingly, North Ayrshire Council indicated that 97% of pupils or their parents had given consent to this process. The Council has temporarily paused the rollout of the software given the ICO’s intervention.

CRB Cunninghams, the company that provides the software, stated that its cameras check pupils' faces against encrypted templates, and thus operate differently from the “live” facial recognition used by the police to scan for criminal activity, which was challenged in the Bridges case.

The principal of one of the schools, David Waugh, commented:

“The combined fingerprint and facial recognition system was part of an upgrade to the catering cashless system, so that the time it takes to serve students is reduced, thus giving a better dining experience. However, we will not be using the facial recognition aspect.”

Mishcon de Reya has an excellent analysis of these issues, which cover Scotland and are thus outside of TPP's remit. The BBC also reports on the story.

Quotes from caselaw 3: Fairhurst v Woodard (Case No: G00MK161) – A cautionary tale for neighbours implementing surveillance

“I am satisfied that the
extent of range to which these devices can capture audio is well beyond the
range of video that they capture, and in my view cannot be said to be
reasonable for the purpose for which the devices are used by the Defendant,
since the legitimate aim for which they are said to be used, namely crime
prevention, could surely be achieved by something less. A great deal of the
purpose could be achieved without audio at all, as is the case with the bulk
of CCTV systems in use in public places in this country, or by a microphone that only picks up sound within a small diameter of the device.

That finding means that I am satisfied that the processing of such audio
data by the Defendant as data controller is not lawful. The extent of the
range means that personal data may be captured from people who are not
even aware that the device is there, or that it records and processes audio
personal data, or that it can do so from such a distance away, in breach of
the first principle.”

HHJ Melissa Clarke at [137]

In Fairhurst, a neighbour complained that the use of several cameras, including a Ring doorbell, amounted to nuisance, harassment and breach of the Data Protection Act 2018.

The harassment and data protection claims succeeded. It was noted, in particular, that the audio recording capabilities of the devices were much broader than their video recording capability. As the above quote shows, the extent of the processing of the audio data was such that it was unlawful under data protection law.

The audio recording capability of the Ring device extended to 40-68ft (12-20m).

Amazon released a statement following the finding in the case: “We strongly encourage our customers to respect their neighbours’ privacy and comply with any applicable laws when using their Ring product.”

The case serves as a cautionary tale for those seeking to implement surveillance around their homes that impinges upon their neighbours.

INFORRM has an excellent case comment for interested readers, as does the Guardian.

Quotes from caselaw 2: Sicri v Associated Newspapers [2020] EWHC 3541 (QB) – Privacy and suspicion by the state

The rationale for the general rule, that an individual has a reasonable expectation of privacy in respect of information that they have come under suspicion by the state, is clear: disclosure of such information is likely to have a seriously harmful impact on the person’s reputation, and thus their private life.

Warby J at [55]

The Sicri case concerned the publication of an article by the Mail Online following the arrest of a man suspected of a connection with the Manchester Arena suicide bomber, Salman Abedi. The Mail Online did not remove the article after the claimant's release and divulged his name (via an alternative spelling), address and other identifying details. The claimant was successful and was awarded £83,000 in damages, as he had a reasonable expectation of privacy in respect of his identity remaining private when his arrest was reported. It should be noted that this reasonable expectation was assessed at the pre-charge stage.

The claimant had a right to expect that the defendant would not publish his identity as the 23-year-old man arrested on suspicion of involvement in the Manchester Arena bombing. By 12:47 on 29 May 2017, the defendant had violated that right; it had no, or no sufficient public interest justification for identifying the claimant. It continued to do so. Later, another publisher did the same or similar. But the claimant’s right to have the defendant respect his privacy was not defeated or significantly weakened by the fact that others failed to do so. He is entitled to compensation. The appropriate sum is £83,000 in general and special damages.

Warby J at [190]

This is part of our new “quotes from caselaw” series, looking to bring you short snippets from leading judgments on privacy, which highlight its importance and development.

Quotes from caselaw 1: Campbell v MGN [2004] 2 AC 457 – The importance of privacy to liberty

“Privacy lies at the heart of liberty in a modern state. A proper degree of privacy is essential for the well-being and development of an individual.”

– Lord Nicholls, Campbell v MGN [2004] 2 AC 457 at [12]

This is part of our new “quotes from caselaw” series, looking to bring you short snippets from leading judgments on privacy, which highlight its importance and development.