TPP number 30 on Feedspot – Top 35 Privacy Websites and Blogs

We are delighted to be ranked 30th in Feedspot's top 35 privacy blogs. TPP was ranked alongside law firms and authoritative blogs on privacy law.

According to Feedspot, sites are ranked "by traffic rank, social media followers, domain authority & freshness." The full list can be found here and is a must-read for anyone interested in privacy law matters.

TPP republished by The Student Lawyer: Use of facial recognition software in school lunch queues in North Ayrshire

TPP is pleased to announce that the article that appeared on this site analysing the use of facial recognition software in schools in North Ayrshire has been republished by The Student Lawyer.

The Student Lawyer is a go-to legal news and blogging site for law students. You can find the article here.

Citation: 5RB: European Court of Human Rights upholds Article 8 privacy breach in relation to the reputation of a dead person

In a case that builds upon pre-existing case law on the rights of the deceased, the European Court of Human Rights has found an Article 8 breach in relation to news articles published about a deceased Roman Catholic priest.

ML v Slovakia (application no. 34159/17) concerned a number of articles published by three Slovakian newspapers about the historic sex offence convictions of the applicant's son.

The Court found that the articles were inaccurate and sensationalist, stating: "However, it follows from what has been said above that the domestic courts failed to carry out a balancing exercise between the applicant's right to private life and the newspaper publishers' freedom of expression in conformity with the criteria laid down in the Court's case-law."

Concluding, the Court stated, applying Article 8:

“…dealing appropriately with the dead out of respect for the feelings of the deceased’s relatives falls within the scope of Article 8 of the Convention”.

Furthermore, the Court expressed a clear and concise view on the journalistic integrity of the reporting: "Although the journalists must be afforded some degree of exaggeration or even provocation, the Court considers that the frivolous and unverified statements about the applicant's son's private life must be taken to have gone beyond the limits of responsible journalism" (at [47]).

5RB has an excellent case comment.

ICO intervenes in nine schools in North Ayrshire which are using facial recognition software to scan faces of pupils in lunch queues

According to the Financial Times and the Guardian, the ICO is set to intervene in nine schools in North Ayrshire following the discovery that pupils' faces were being scanned in lunch queues to take payments.

The ICO commented: 

“Data protection law provides additional protections for children, and organisations need to carefully consider the necessity and proportionality of collecting biometric data before they do so. Organisations should consider using a different approach if the same goal can be achieved in a less intrusive manner. We are aware of the introduction, and will be making inquiries with North Ayrshire council.”

Whilst the company that provides the software argues that this is a safe way to take payments in the age of Covid, the question clearly arises, as the ICO rightly posits, as to whether a less invasive method of safely taking payments could be used.

Simple measures, such as issuing pupils with lunch cards that they can scan to identify themselves, or even just a unique ID number that could easily be anonymised and aggregated, would serve this purpose just as well; a minimal sketch of such an approach appears below.
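By way of illustration only, here is a minimal sketch of such a scheme in Python: pupils are identified by opaque card IDs rather than biometrics, and reporting is done on aggregated counts. The IDs and figures are invented for the example.

```python
# A minimal sketch of a cashless canteen keyed by opaque card IDs,
# with no biometric data collected. All IDs here are hypothetical.
from collections import Counter

def serve_lunch(card_id: str, ledger: Counter) -> None:
    """Record one meal against an opaque card ID (no name, no face)."""
    ledger[card_id] += 1

ledger: Counter = Counter()
for card in ["ID-103", "ID-221", "ID-103"]:  # hypothetical scans
    serve_lunch(card, ledger)

# Aggregate reporting identifies no individual pupil.
print(sum(ledger.values()))  # -> 3 meals served
```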

Under Article 35 of the UK GDPR, a Data Protection Impact Assessment (DPIA) must be carried out before such software is used. This would assess whether the use of facial recognition software was a proportionate means of achieving the legitimate aim of taking payments securely. Aspects such as the retention period of the data, storage methods, the basis for processing, safeguards and processes for gathering consent must be considered.

Schools should have mechanisms and documentation in place to explain to children how this data is collected and stored and what their rights are under the GDPR, including an option to opt out of the data collection.

Under the GDPR, the age at which children in England and Wales can consent to the sharing of their personal data is as low as is permissible: thirteen. In Scotland, where the schools are located, the age is lower still, at twelve years of age.

Interestingly, North Ayrshire Council indicated that 97% of pupils or their parents had given consent to this process. The Council has temporarily paused the rollout of the software given the ICO’s intervention.

CRB Cunninghams, the company that provides the software, stated that its cameras check pupils' faces against encrypted templates and thus operate differently to the "live" facial recognition used by the police to scan for criminal activity, which was challenged in the Bridges case.

The principal of one of the schools, David Waugh, commented:

“The combined fingerprint and facial recognition system was part of an upgrade to the catering cashless system, so that the time it takes to serve students is reduced, thus giving a better dining experience. However, we will not be using the facial recognition aspect.”

Mishcon de Reya has an excellent analysis of these issues, which concern Scotland and are thus outside of TPP's remit. The BBC also reports on the story.

Quotes from caselaw 3: Fairhurst v Woodard (Case No: G00MK161) – A cautionary tale for neighbours implementing surveillance

"I am satisfied that the extent of range to which these devices can capture audio is well beyond the range of video that they capture, and in my view cannot be said to be reasonable for the purpose for which the devices are used by the Defendant, since the legitimate aim for which they are said to be used, namely crime prevention, could surely be achieved by something less. A great deal of the purpose could be achieved without audio at all, as is the case with the bulk of CCTV systems in use in public places in this country, or by a microphone that only picks up sound within a small diameter of the device.

That finding means that I am satisfied that the processing of such audio data by the Defendant as data controller is not lawful. The extent of the range means that personal data may be captured from people who are not even aware that the device is there, or that it records and processes audio personal data, or that it can do so from such a distance away, in breach of the first principle."

HHJ Melissa Clarke at [137]

In Fairhurst, a neighbour complained that the use of several cameras, including a Ring doorbell, amounted to nuisance, harassment and a breach of the Data Protection Act 2018.

The claims in harassment and data protection succeeded. It was noted, in particular, that the audio recording capability of the devices was much broader than their video recording capability. As the above quote shows, the range over which audio data was captured meant that its processing was unlawful under data protection law.

The audio recording capability of the Ring device extended to 40–68ft (12–20m).

Amazon released a statement following the finding in the case: “We strongly encourage our customers to respect their neighbours’ privacy and comply with any applicable laws when using their Ring product.”

The case serves as a cautionary tale for those seeking to implement surveillance around their homes in a way that impinges upon their neighbours.

INFORRM has an excellent case comment for interested readers. As does the Guardian.

Quotes from caselaw 2: Sicri v Associated Newspapers [2020] EWHC 3541 (QB) – Privacy and suspicion by the state

The rationale for the general rule, that an individual has a reasonable expectation of privacy in respect of information that they have come under suspicion by the state, is clear: disclosure of such information is likely to have a seriously harmful impact on the person’s reputation, and thus their private life.

Warby J at [55]

The Sicri case concerned the publication of an article by the Mail Online following the arrest of a man suspected of a connection with the Manchester Arena suicide bomber Salman Abedi. The Mail Online did not remove the article after the claimant's release and divulged his name (via an alternative spelling), his address and other identifying details. The claimant was successful and was awarded £83,000 in damages, as he had a reasonable expectation of privacy in respect of his identity remaining private when his arrest was reported. It should be noted that this reasonable expectation was assessed at the pre-charge stage.

The claimant had a right to expect that the defendant would not publish his identity as the 23-year-old man arrested on suspicion of involvement in the Manchester Arena bombing. By 12:47 on 29 May 2017, the defendant had violated that right; it had no, or no sufficient public interest justification for identifying the claimant. It continued to do so. Later, another publisher did the same or similar. But the claimant’s right to have the defendant respect his privacy was not defeated or significantly weakened by the fact that others failed to do so. He is entitled to compensation. The appropriate sum is £83,000 in general and special damages.

Warby J at [190]

This is part of our new “quotes from caselaw” series, looking to bring you short snippets from leading judgments on privacy, which highlight its importance and development.

Privacy Law Monthly Round Up – September 2021

Headlines

Ben and Deborah Stokes' privacy claim against The Sun for the highly intrusive article detailing traumatic events in the Stokes family's past was settled on 30 August 2021, with the newspaper agreeing to publish an apology and pay substantial damages. Paul Wragg wrote about The Sun's "nonsensical" defence for the Inforrm Blog, concluding that the only party spared the anguish of trial was the newspaper's defence team.

Government and General legislative developments

The controversial Police, Crime, Sentencing and Courts Bill had its second reading in the House of Lords this month. The Bill is notorious for its proposed restrictions on peaceful protest, which critics have predicted will have a discriminatory impact and breach the rights to freedom of expression and assembly. Broadened police powers would also enable the extraction of more information from mobile phones.

The Age Appropriate Design Code (aka the "Children's Code") entered into force on 2 September 2021 following a one-year transition period. The Children's Code explains to businesses how the UK GDPR, the Data Protection Act and the Privacy and Electronic Communications Regulations apply to the design and delivery of Information Society Services ("ISS") – i.e. social media, educational and gaming platforms – used by children. The Children's Code is the first of its kind worldwide and has been welcomed by many as a positive development for keeping children safe online. The 15 standards that the Code sets can be found here.

Sticking with child safety online, Home Secretary Priti Patel launched a Safety Tech Challenge Fund at the G7 meeting at the start of this month. Five applicants will be awarded up to £85,000 each to develop new technologies that enable the detection of child sexual abuse material online without breaking end-to-end encryption.

The UK Government has launched a public consultation on post-Brexit data protection legislation reform, entitled Data: A new direction. The consultation is open until 19 November. Following the end of the Brexit transition period, the UK's data protection regime, which derived from the EU framework, has been transposed into domestic law as the UK GDPR. The Government is seeking to use this opportunity to make some changes to the current regime. The Hawktalk Blog discusses how some of these proposals are unethical and unsafe. Further discussion can be found on the Panopticon Blog and the Data Protection Report.

Data Privacy and Data Protection

Cressida Dick, the Metropolitan Police Commissioner, has accused tech giants of undermining terrorism prevention efforts through their focus on end-to-end encryption. Writing in The Telegraph on the twentieth anniversary of the 9/11 attacks, she said that it is "impossible in some cases" for the police to fulfil their role of protecting the public. Given the pressure on tech giants to ensure users' privacy, companies are unlikely to reshape their platforms to facilitate more extensive monitoring.

Apple has delayed its plan to scan its users' iCloud images for child sexual abuse material. The proposed detection technology would compare images, before they are uploaded to iCloud, against unique "digital fingerprints" of known child sexual abuse material maintained by the National Center for Missing & Exploited Children. The plan was criticised by privacy groups because it involved using an individual's own device to check whether they were potentially engaged in criminal activity.

Surveillance

The Metropolitan Police have invested £3 million into new facial recognition technologies (FRT) that will greatly increase surveillance capabilities in the capital. The expansion of the Met's technology will enable it to process historic images from CCTV feeds, social media and other sources in order to track down suspects. Critics argue that such FRT encroaches on privacy by "turning back the clock to see who you are, where you've been, what you have done and with whom, over many months or even years." There is also concern that FRT can exacerbate existing racial discrimination in the criminal justice system. The UK's Surveillance Camera Commissioner (SCC), Professor Fraser Sampson, has acknowledged that some uses of FRT "are so ethically fraught" that it may only be appropriate to carry them out under licence in the future.

NGO’s

Big Brother Watch published an opinion piece warning that the imposition of vaccine passports could reorganise Britain into a two-tier, checkpoint society. The article responds to the Scottish Parliament's vote in favour of vaccine passports earlier this month. Wales has since followed Scotland and announced mandatory vaccination and COVID status check schemes. The UK Government has not yet committed to such a regime for England. The ICO has emphasised that data protection laws will not stand in the way of mandatory vaccination and COVID status checks, but rather facilitate responsible sharing of personal data where it is necessary to protect public health.

Privacy International has considered how the data-intensive systems and surveillance infrastructure developed in Afghanistan by national and foreign actors, as part of development and counter-terrorism measures, will fare under the Taliban regime.

From the regulator

ICO

The ICO has announced two fines this month:

  • A total of £495,000 was imposed against We Buy Any Car, Saga, and Sports Direct for sending more than 354 million “frustrating and intrusive” nuisance messages between them. None of the companies had permission to send recipients marketing emails or texts, making their behaviour illegal;
  • The Glasgow-based company DialADeal Scotland Ltd was fined £150,000 for making more than 500,000 nuisance marketing calls to recipients who had not given permission to receive them.

The ICO has also released a communiqué from a meeting on data protection and privacy held by the G7 authorities at the start of the month. The meeting is closely aligned with the Roadmap for Cooperation on Data Free Flow with Trust announced by G7 Digital and Technology Ministers on 28 April 2021.

IPSO

IPSO has published a number of privacy rulings and resolutions this month.

IMPRESS

There were no IMPRESS rulings relating to privacy this month.

Cases

The Inforrm Blog has published an article detailing the continued decline in privacy injunction applications in England and Wales in 2021. There were only three applications in the first six months of the year, down from ten in 2020. All three applications were successful. Only 4% of the newly issued cases on the Media and Communications List related to misuse of private information or breach of privacy.

No judgments relating to privacy were handed down this month.


Written by Colette Allen

Colette Allen has hosted "Newscast" on The Media Law Podcast with Dr Thomas Bennett and Professor Paul Wragg since 2018. She has recently finished the BTC at The Inns of Court College of Advocacy and will be starting an MSc in the Social Sciences of the Internet at the University of Oxford in October 2021.

Healthcare data and data protection in the time of coronavirus – Olivia Wint

The processing of special category personal data (including health data, e.g. vaccination status, blood type and health conditions) was a common topic before the COVID-19 pandemic (the "pandemic"), with various resources published exploring it.

For example, the European Data Protection Board ("EDPB") published an adopted opinion on the interplay between the Clinical Trials Regulation and the General Data Protection Regulation* ("GDPR") (23 January 2019), the Information Commissioner's Office ("ICO") posted a blog on why special category personal data needs to be handled even more carefully (14 November 2019), and the ICO published guidance on the lawful bases for processing special category data in compliance with the GDPR (November 2019).

The pandemic has brought a number of data protection considerations to the fore, all of which already existed but have been exacerbated by the pandemic (employee monitoring, contact tracing, the workforce shift from office to home, etc.). One that is more prevalent than ever before is the processing of health data. This piece aims to cover some key data protection themes and offer practical insights into the processing of health data.

Health data, a subset of special category personal data, by its very nature comes with an increased risk profile. When processing this data type, there are not only legislative data protection requirements and the expectation of good clinical governance practices, but also regulatory body considerations.

For example, the Care Quality Commission has in place a code of practice on confidential personal information, the NHS Health Research Authority has in place GDPR guidance specifically for researchers and study coordinators as well as technical guidance for those responsible for information governance within their organisations, and the NHS more generally has in place its Data Security and Protection Toolkit (the "Toolkit"). The Toolkit is an online self-assessment tool that enables organisations to measure and publish their performance against the National Data Guardian's ten data security standards. It covers records management and retention, training and awareness, system vulnerability management and crisis management, to name a few.

The above all applies at a national (UK) level. At an international level, there are data protection laws that specifically cover health data, such as HIPAA in the US, the Patient Data Protection Act in Germany, and various provincial health data privacy laws in Canada, such as the Health Information Act in Alberta.

Whilst the previous paragraphs highlight the complexities of processing health data, at both a national and an international level, in comparison to other data types, there are a number of mitigations that organisations can put in place to adequately reduce the risks associated with processing this type of data. Mitigations such as Data Protection Impact Assessments ("DPIAs"), updated privacy notices and appropriate security measures, amongst other things, should all be considered.

Many organisations that never historically processed health data may now do so as a result of the pandemic…

Covering your bases

The first base that must be covered when processing data is ensuring that an appropriate legal basis has been established for each data processing activity; for example, if health data is processed for both employee monitoring and research, a legal basis will need to be established for each of these activities. Legal bases include the performance of a contract, the legitimate interests** of the organisation and/or compliance with a legal obligation. Where the processing of health data is concerned, an additional condition under Article 9 of the UK GDPR must be met. In the healthcare context, applicable conditions may include explicit consent, health or social care purposes, public health purposes and/or archiving, research and statistical purposes.

Many organisations that historically never processed health data may now do so as a result of the pandemic; alternatively, organisations that processed health data pre-pandemic may now be doing so in larger volumes. Organisations on either side of that coin should also assess the extent to which their privacy notice(s) have been updated and/or need to be updated, both to make data subjects aware of any applicable data processing changes and to comply with transparency obligations.

Next, large-scale processing of health data may pose a 'high risk to the rights and freedoms of natural persons' and, in such cases, will trigger the requirement for a DPIA. For a DPIA to have value, it is important for organisations to carry it out early on, so that privacy by design and by default is embedded in any system or processing activity.

A DPIA will assess the likelihood and severity of harm related to the processing activity in question; should the DPIA identify a high risk with no available mitigations, consultation with the ICO will be needed. The ICO has set out a 9-step lifecycle for the DPIA, all of which should be completed before any data processing takes place:

  1. Identify a need for a DPIA;
  2. Describe the processing;
  3. Consider consultation;
  4. Assess necessity and proportionality;
  5. Identify and assess risks;
  6. Identify measures to mitigate risk;
  7. Sign off and record outcomes;
  8. Integrate outcomes into plan; and
  9. Keep under review.

Internally, organisations should have appropriate technical and organisational measures in place which reflect the risk presented. In relation to technical measures, appropriate internal controls and security measures should be utilised. Organisations may wish to consider a combination of controls to give health data the best level of protection; these may include encryption of data both in transit and at rest, role-based access controls within organisations, and the adoption of, and accreditation against, industry-recognised security standards such as ISO 27001. A minimal sketch of encryption at rest appears below.
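By way of illustration only, here is a minimal sketch of encrypting a health record at rest using the widely used Python cryptography library; the record contents and key handling are hypothetical, and a real deployment would hold keys in a dedicated key-management service.

```python
# A minimal sketch of encrypting a health record at rest, assuming
# the "cryptography" package is installed (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # hypothetical: real keys belong in a KMS, not in code
cipher = Fernet(key)

# Hypothetical record; only the ciphertext is ever written to disk.
record = b'{"patient": "pseudonym-7f3a", "vaccinated": true}'
stored = cipher.encrypt(record)

# The plaintext is recoverable only by a holder of the key.
assert cipher.decrypt(stored) == record
```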

In respect of organisational measures, it may be apt to implement training and awareness sessions, with tailored training administered to employees who will be carrying out data processing activities, and to have a robust policy suite in place covering key circumstances such as data breaches and business continuity.

Data sharing

A specific data processing activity that may be utilised more in the wake of the pandemic is data sharing between organisations for information and research purposes. In England, the soon-to-be-implemented GP data sharing scheme aims to create a new framework for a central NHS digital database built from GP records, and the UK's Department of Health and Social Care ("DHSC") has recently published a draft policy paper titled 'Data saves lives: reshaping health and social care with data'. The paper sets out the DHSC's aspiration to introduce new legislation as part of the Health and Care Bill (currently at Committee stage) to encourage data sharing between private health providers and the NHS, and to put more guard rails around the sharing of data generally by mandating standards for how data is collected and stored.

As the above shows, data sharing is something that will be advocated for and welcomed in due course, so it is important that organisations have in place appropriate contractual and practical measures to protect data, as data is at its most vulnerable when in motion. Contractual measures include ensuring data sharing and/or transfer agreements are in place which cover all necessary contractual provisions and provide adequate assurances as to the data sharing/transfer arrangements. NHSX has published a template Data Sharing Agreement which has been labelled as suitable for use by all health and care organisations and includes risk management, legal basis, and confidentiality and privacy provisions, amongst other things. Practical measures include conducting due diligence checks on all organisations that may be in receipt of data as part of the data sharing process (including third parties) and anonymising/pseudonymising data; a minimal sketch of pseudonymisation follows below. The ICO has put in place a comprehensive data sharing checklist which invites organisations to consider data minimisation, accountability and data subject rights.
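By way of illustration only, the sketch below pseudonymises a record with a keyed hash (HMAC) before sharing, so the pseudonym is stable enough for linkage but cannot be reversed without a secret key retained by the disclosing organisation. The field names, identifier and key are all hypothetical.

```python
# A minimal sketch of pseudonymisation before data sharing, using only
# Python's standard library. All values here are hypothetical.
import hashlib
import hmac

SECRET_KEY = b"retained-by-data-controller"  # hypothetical key, never shared

def pseudonymise_record(record: dict) -> dict:
    """Swap the direct identifier for a keyed-hash pseudonym."""
    shared = dict(record)
    identifier = shared.pop("nhs_number")  # hypothetical identifier field
    digest = hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256)
    shared["pseudonym"] = digest.hexdigest()[:16]
    return shared

original = {"nhs_number": "943-476-5919", "test_result": "negative"}
print(pseudonymise_record(original))  # the raw identifier never leaves
```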

The pandemic has changed the world as we knew it in more ways than one and, in the context of health data, what seems certain is that the processing of such data is on the rise. As such, organisations should continue to monitor guidance and developments in this area and, as a first port of call, ensure that data protection principles are at the core of all data processing activities.

* EDPB guidelines are no longer directly relevant to the UK data protection regime and are not binding under the UK regime.

** A legitimate interests assessment should be carried out when relying on legitimate interests as a lawful basis.

Olivia Wint is a seasoned data protection professional with over five years' experience in this area. She has worked in a range of sectors, including local authority, third sector, start-ups and the Big 4, advising on all aspects of data protection compliance.

Cricketer Ben Stokes and mother Deborah Stokes achieve settlement in privacy case against the Sun newspaper, securing rare unreserved apology

Following the publication of an article in The Sun in 2019 concerning a family matter that occurred before the cricketer was born, Ben Stokes and his mother have reached a settlement with the newspaper.

Ben Stokes' mother, Deborah Stokes, commented: "The decision to publish this article was a decision to expose, and to profit from exposing, intensely private and painful matters within our family. The suffering caused to our family by the publication of this article is something we cannot forgive.

“Ben and I can take no pleasure in concluding this settlement with the Sun. We can only hope that our actions in holding the paper to account will leave a lasting mark, and one that will contribute to prevent other families from having to suffer the same pain as was inflicted on our family by this article.”

The family were represented by Brabners LLP. Paul Lunt, solicitor to Ben and Deborah Stokes and Head of Litigation, said: "The Sun has apologised to Ben and Deborah. The paper has accepted that the article ought never to have seen the light of day. The apology to our clients acknowledges the great distress caused to Ben, Deborah and their family by what was a gross intrusion – and exploitation – of their privacy. Substantial damages have also been paid, as well as payment of legal costs."

See the Brabners LLP press release here.

The Sun stated: “On 17 September 2019 we published a story titled ‘Tragedy that Haunts Stokes’ Family’ which described a tragic incident that had occurred to Deborah Stokes, the mother of Ben Stokes, in New Zealand in 1988. The article caused great distress to the Stokes family, and especially to Deborah Stokes. We should not have published the article. We apologise to Deborah and Ben Stokes. We have agreed to pay them damages and their legal costs.”

Coverage of the settlement can be found in the Guardian, Press Gazette and BBC Sport, amongst others.