ICO launches consultation on the Draft Journalism Code of Practice

The ICO’s consultation on its Draft Journalism Code of Practice has begun.

Be sure to have your say: the deadline to submit responses is 22 January 2022.

The Code covers privacy safeguards among many other topics. In particular, it covers the journalism exemption under the Data Protection Act 2018, a broad exemption that disapplies certain requirements relating to the holding and processing of data.

Journalism should be balanced with other rights that are also
fundamentally important to democracy, such as data protection and the
right to privacy.

at p.4

The Code substantively addresses the safeguarding of journalism under the exemption, briefly touching on balancing a free press against privacy rights before going on to discuss how this balance is struck under data protection laws:

Why is it important to balance journalism and privacy?


It is widely accepted that a free press, especially a diverse press, is a fundamental component of a democracy. It is associated with strong and important public benefits worthy of special protection. This in itself is a public interest.

Most obviously, a free press plays a vital role in the free flow of communications in a democracy. It increases knowledge, informs debates and helps citizens to participate more fully in society. All forms of journalistic content can perform this crucial role, from day-to-day stories about local events to celebrity gossip to major public interest investigations.

A free press is also regarded as a public watch-dog. It acts as an important
check on political and other forms of power, and in particular abuses of
power. In this way, it helps citizens to hold the powerful to account.

However, the right to freedom of expression and information should be
balanced with other rights that are necessary in a democratic society, such
as the right to privacy. The public interest in individual freedom of expression
is itself an aspect of a broader public interest in the autonomy, integrity and
dignity of individuals.

The influence and power of the press in society, and the reach of the internet, means that it is particularly important to balance journalism and people’s right to privacy.

This code provides guidance about balancing these two important rights by
helping you to understand what data protection law requires and how to
comply with these requirements effectively.

at p.25

Quotes from caselaw 3: Fairhurst v Woodard (Case No: G00MK161) – A cautionary tale for neighbours implementing surveillance

“I am satisfied that the
extent of range to which these devices can capture audio is well beyond the
range of video that they capture, and in my view cannot be said to be
reasonable for the purpose for which the devices are used by the Defendant,
since the legitimate aim for which they are said to be used, namely crime
prevention, could surely be achieved by something less. A great deal of the
purpose could be achieved without audio at all, as is the case with the bulk
of CCTV systems in use in public places in this country, or by a microphone that only picks up sound within a small diameter of the device.


That finding means that I am satisfied that the processing of such audio
data by the Defendant as data controller is not lawful. The extent of the
range means that personal data may be captured from people who are not
even aware that the device is there, or that it records and processes audio
personal data, or that it can do so from such a distance away, in breach of
the first principle.”

HHJ Melissa Clarke, at p.137

In Fairhurst a neighbour complained that the use of several cameras, including a Ring doorbell, amounted to nuisance, harassment and breach of the Data Protection Act 2018.

The claims in harassment and data protection succeeded. In particular, the court noted that the audio recording capability of the devices was much broader than their video recording capability. As the quote above shows, the extent of the processing of the audio data was such that it was unlawful under data protection law.

The audio recording capability of the Ring device extended 40-68ft (12-20m).

Amazon released a statement following the finding in the case: “We strongly encourage our customers to respect their neighbours’ privacy and comply with any applicable laws when using their Ring product.”

The case serves as a cautionary tale for those seeking to implement surveillance around their homes that impinges upon their neighbours’ privacy.

INFORRM has an excellent case comment for interested readers. As does the Guardian.

Privacy Law Monthly Round Up – September 2021

Headlines

Ben and Deborah Stokes’ privacy claim against The Sun for the highly intrusive article detailing traumatic events in the Stokes’ family past was settled on 30 August 2021, with the newspaper agreeing to publish an apology and pay substantial damages. Paul Wragg wrote about The Sun’s “nonsensical” defence for the Inforrm Blog, concluding that the only party spared the anguish of trial was the newspaper’s defence team.

Government and General legislative developments

The controversial Police, Crime, Sentencing and Courts Bill had its second reading in the House of Lords this month. The Bill is notorious for its proposed restrictions on peaceful protest, which critics have predicted will have a discriminatory impact and breach the rights to freedom of expression and assembly. Broadened police powers would also enable the extraction of more information from mobile phones.

The Age Appropriate Design Code (aka the “Children’s Code”) entered into force on 2 September 2021 following a one-year transition period. The Children’s Code explains to businesses how the UK GDPR, Data Protection Act and Privacy and Electronic Communications Regulations apply to the design and delivery of Information Society Services (“ISS”) – i.e. social media, educational and gaming platforms – used by children. The Children’s Code is the first of its kind worldwide, and has been welcomed by many as a positive development for keeping children safe online. The 15 standards that the Code sets can be found here.

Sticking with child safety online, Home Secretary Priti Patel launched a Safety Tech Challenge fund at the G7 meeting at the start of this month. Five applicants will be awarded up to £85,000 each to develop new technologies that enable the detection of child sexual abuse material online without breaking end-to-end encryption.

The UK Government has launched a public consultation on data protection legislation reform following Brexit, entitled Data: A new direction. The consultation is open until 19 November. Following the end of the Brexit transition period, the UK’s data protection regime, which derived from the EU framework, was transposed into domestic law as the UK GDPR. The Government is seeking to use this opportunity to make some changes to the current regime. The Hawtalk Blog discusses how some of these proposals are unethical and unsafe. Further discussion can be found on the Panopticon Blog and the Data Protection Report.

Data Privacy and Data Protection

Cressida Dick, the Metropolitan Police Commissioner, has accused tech giants of undermining terrorist prevention efforts by virtue of their focus on end-to-end encryption. Writing in The Telegraph on the twentieth anniversary of the 9/11 attacks, she said that it is “impossible in some cases” for the police to fulfil their role to protect the public. Given the pressure on tech giants to ensure users’ privacy, companies are unlikely to reshape their platforms to facilitate more extensive monitoring.

Apple has delayed its plan to scan its users’ iCloud images for child sexual abuse material. The proposed detection technology would compare images before they are uploaded to iCloud against unique “digital fingerprints” of known child pornographic material maintained by the National Centre for Missing and Exploited Children. The plan was criticised by privacy groups because it involved using an individual’s own device to check if they were potentially engaged in criminal activity.

Surveillance

The Metropolitan Police have invested £3 million into new facial recognition technologies (FRT) that will greatly increase surveillance capabilities in the capital. The expansion of the Met’s technology will enable it to process historic images from CCTV feeds, social media and other sources in order to track down suspects. Critics argue that such FRT encroaches on privacy by “turning back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years.” There is also concern that FRT can exacerbate existing racial discrimination in the criminal justice system. The UK’s Surveillance Camera Commissioner (SCC), Professor Fraser Sampson, has acknowledged that some FRT “are so ethically fraught” that it may only be appropriate to carry them out under license in the future.

NGOs

Big Brother Watch published an opinion piece warning that the imposition of vaccine passports could reorganise Britain into a two-tier, checkpoint society. The article responds to the Scottish Parliament’s vote in favour of vaccine passports earlier this month. Wales has since followed Scotland and announced mandatory vaccination and COVID status check schemes. The Government has not yet committed to such a regime for England. The ICO has emphasised that data protection laws will not stand in the way of mandatory vaccination and COVID status checks, but rather facilitate responsible sharing of personal data where it is necessary to protect public health.

Privacy International has considered how data-intensive systems and surveillance infrastructure, developed by national and foreign actors, in Afghanistan as part of developmental and counter-terrorism measures will fare under the Taliban regime.

From the regulator

ICO

The ICO has announced two fines this month:

  • A total of £495,000 was imposed against We Buy Any Car, Saga, and Sports Direct for sending more than 354 million “frustrating and intrusive” nuisance messages between them. None of the companies had permission to send recipients marketing emails or texts, making their behaviour illegal;
  • The Glasgow-based company DialADeal Scotland Ltd was fined £150,000 for making more than 500,000 nuisance marketing calls to recipients who had not given permission to receive them.

The ICO has also released a communiqué from a meeting on data protection and privacy held by the G7 authorities at the start of the month. The meeting is closely aligned with the Roadmap for Cooperation on Data Free Flow with Trust announced by G7 Digital and Technology Ministers on 28 April 2021.

IPSO

IPSO has published a number of privacy rulings and resolutions.

IMPRESS

There were no IMPRESS rulings relating to privacy this month.

Cases

The Inforrm Blog has published an article detailing the continued decline in privacy injunction applications in England and Wales for 2021. There were only three applications in the first six months of the year, down from ten in 2020. All three applications were successful. Only 4% of the newly issued cases on the Media and Communications List related to misuse of private information or breach of privacy.

No judgments relating to privacy have been handed down this month.


Written by Colette Allen

Colette Allen has hosted “Newscast” on The Media Law Podcast with Dr Thomas Bennett and Professor Paul Wragg since 2018. She has recently finished the BTC at The Inns of Court College of Advocacy and will be starting an MSc in the Social Sciences of the Internet at the University of Oxford in October 2021.

Healthcare data and data protection in the time of coronavirus – Olivia Wint

The processing of special category personal data (including health data e.g. vaccination status, blood type, health conditions etc) was a common topic before the COVID-19 pandemic (the “pandemic”), with various resources published that explored this topic.

For example, the European Data Protection Board (“EDPB”) published an adopted opinion on the interplay between the Clinical Trials Regulation and the General Data Protection Regulation* (“GDPR”) (23 January 2019), the Information Commissioner’s Office (“ICO”) posted a blog on why special category personal data needs to be handled even more carefully (14 November 2019) and the ICO published guidance on the lawful bases for processing special category data in compliance with the GDPR (November 2019).

The pandemic has brought about a number of data protection considerations, all of which were already in existence but have been exacerbated by the pandemic (employee monitoring, contact tracing, the workforce shift from office to home etc.). One that is more prevalent than ever before is the processing of health data. This piece aims to cover some key data protection themes and practical insights into the processing of health data.

Health data, a subset of special category personal data, by its very nature comes with an increased risk profile. When processing this data type, there are not only legislative data protection requirements and the expectation of good clinical governance practices, but regulatory body considerations too.

For example, the Care Quality Commission has in place a code of practice on confidential personal information, the NHS Health Research Authority has in place GDPR guidance specifically for researchers and study coordinators as well as technical guidance for those responsible for information governance within their organisation, and the NHS more generally has in place its Data Security and Protection Toolkit (the “Toolkit”). The Toolkit is an online self-assessment tool that enables organisations to measure and publish their performance against the National Data Guardian’s ten data security standards. The Toolkit covers records management and retention, training and awareness, system vulnerability management and crisis management, to name a few.

The above is all at a national (UK) level. At an international level, there are data protection laws which specifically cover health data, such as HIPAA in the US, the Patient Data Protection Act in Germany, and various provincial health data privacy laws in Canada such as the Health Information Act in Alberta.

Whilst the previous paragraph highlights the complexities of processing health data, whether at a national or international level, in comparison to other data types, there are a number of mitigations that organisations can put in place to adequately reduce the associated risks. Mitigations such as Data Protection Impact Assessments (“DPIAs”), updated privacy notices and appropriate security measures, amongst other things, should all be considered.

Many organisations that never historically processed health data may now do so as a result of the pandemic…

Covering your bases

The first base that must be covered when processing data is ensuring that an appropriate legal basis has been established for each data processing activity; for example, if health data is processed for both employee monitoring and research, a legal basis for each of these activities will need to be established. Legal bases include the performance of a contract, the legitimate interests** of the organisation and/or compliance with a legal obligation. Where the processing of health data is concerned, an additional condition under Article 9 of the UK GDPR must be met. In the healthcare context, applicable conditions may include explicit consent, health or social care purposes, public health purposes and/or archiving, research and statistical purposes.

Many organisations that never historically processed health data may now do so as a result of the pandemic; alternatively, organisations that processed health data pre-pandemic may now be doing so in larger amounts. Organisations on either side of the coin should assess the extent to which their privacy notice(s) have been, or need to be, updated in order to make data subjects aware of any applicable data processing changes and to comply with transparency obligations.

Next, large-scale processing of health data may pose a ‘high risk to the rights and freedoms of natural persons’ and in such cases will trigger the requirement for a DPIA. For a DPIA to have value, it is important for organisations to ensure that it is assessed and considered early on, so that privacy by design and default is embedded in any system or processing activity.

A DPIA will assess the likelihood and severity of harm related to the processing activity in question and should the DPIA identify a high risk with no available mitigations, consultation with the ICO will be needed. The ICO has set out a 9-step lifecycle for the DPIA, all of which should be considered before any data processing has taken place:

  1. Identify a need for a DPIA;
  2. Describe the processing;
  3. Consider consultation;
  4. Assess necessity and proportionality;
  5. Identify and assess risks;
  6. Identify measures to mitigate risk;
  7. Sign off and record outcomes;
  8. Integrate outcomes into plan; and
  9. Keep under review.
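The lifecycle above is sequential, so it can be sketched as a simple progress tracker. This is an illustrative Python sketch only (the step names follow the ICO's list above; the function and completion state are this sketch's own invention, not an ICO tool):

```python
# Illustrative sketch: tracking progress through the ICO's 9-step DPIA lifecycle.
# Step names follow the list above; the tracking logic is hypothetical.
DPIA_STEPS = [
    "Identify a need for a DPIA",
    "Describe the processing",
    "Consider consultation",
    "Assess necessity and proportionality",
    "Identify and assess risks",
    "Identify measures to mitigate risk",
    "Sign off and record outcomes",
    "Integrate outcomes into plan",
    "Keep under review",
]

def next_step(completed):
    """Return the first lifecycle step not yet completed, or None if all are done."""
    for step in DPIA_STEPS:
        if step not in completed:
            return step
    return None

print(next_step({"Identify a need for a DPIA"}))  # "Describe the processing"
```

Because all nine steps should be considered before processing begins, a tracker like this makes it easy to evidence (for accountability purposes) which steps remain outstanding.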

Internally, organisations should have appropriate technical and organisational measures in place which reflect the risk presented. In relation to technical measures, appropriate internal controls and security measures should be utilised. Organisations may wish to consider a combination of controls to ensure that health data has the best level of protection; this may include end-to-end encryption for data both in transit and at rest, role-based access within organisations, and the adoption and accreditation of industry-recognised security standards such as ISO 27001.
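To illustrate one of the controls mentioned above, role-based access can be reduced to a mapping from roles to permitted actions. This is a minimal sketch, and the role names and permission sets are hypothetical examples, not drawn from any NHS or ISO standard:

```python
# Minimal sketch of role-based access to health records (illustrative only).
# Role names and permission sets here are hypothetical examples.
ROLE_PERMISSIONS = {
    "clinician": {"read", "write"},
    "researcher": {"read"},   # read-only, ideally against pseudonymised data
    "admin": set(),           # administrative role with no access to clinical content
}

def can_access(role, action):
    """Return True if the given role is permitted to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(can_access("clinician", "write"))   # True
print(can_access("researcher", "write"))  # False
```

Note that unknown roles default to no access, reflecting the least-privilege principle that underpins role-based access in practice.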

In respect of organisational measures, it may be apt to implement training and awareness sessions, with tailored training administered to employees who will be carrying out data processing activities, and to have a robust policy suite in place covering key circumstances such as data breaches and business continuity.

Data sharing

A specific data processing activity that may be utilised more in the wake of the pandemic is data sharing between organisations for information and research purposes. In England, the soon-to-be-implemented GP Data Sharing Scheme aims to create a new framework for a central NHS digital database built from GP records, and the UK’s Department of Health and Social Care (“DHSC”) has recently published a draft policy paper titled ‘Data saves lives: reshaping health and social care with data’. The policy covers the DHSC’s aspiration to introduce new legislation as part of the Health and Care Bill (currently at Committee stage) to encourage data sharing between private health providers and the NHS, and to have more guard rails around the sharing of data generally by mandating standards for how data is collected and stored.

With data sharing, as evidenced by the above, set to be advocated for and welcomed in due course, it is important that organisations have in place the appropriate contractual and practical measures to protect data, as data in motion is when it is most vulnerable. Contractual measures include ensuring data sharing and/or transfer agreements are in place which cover all necessary contractual provisions and provide adequate assurances as to the data sharing/transfer arrangements. NHSX has published a template Data Sharing Agreement which has been labelled as suitable for use by all health and care organisations and includes risk management, legal basis and confidentiality and privacy provisions, amongst other things. Practical measures include conducting due diligence checks on all organisations which may be in receipt of data as part of the data sharing process (including third parties) and anonymising/pseudonymising data. The ICO has put in place a comprehensive data sharing checklist which invites organisations to consider data minimisation, accountability and data subject rights.
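One common practical approach to the pseudonymisation mentioned above is keyed hashing: direct identifiers are replaced with stable tokens that the recipient cannot reverse without the key. This is an illustrative sketch only, assuming a hypothetical record layout and placeholder key; it is not a prescribed NHS or ICO technique:

```python
# Illustrative sketch: keyed pseudonymisation of a direct identifier before sharing.
# The key and field names are placeholders; in practice the key must be held
# only by the disclosing organisation, so recipients cannot reverse the tokens.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(identifier):
    """Replace a direct identifier with a stable, non-reversible token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "A123456", "diagnosis": "hypertension"}

# The shared record carries only the token, never the raw identifier.
shared = {
    "patient_token": pseudonymise(record["patient_id"]),
    "diagnosis": record["diagnosis"],
}
print(sorted(shared))  # ['diagnosis', 'patient_token']
```

Pseudonymised data remains personal data under the UK GDPR (the disclosing organisation can still re-identify it), but sharing tokens rather than raw identifiers substantially reduces the risk to data in motion.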

The pandemic has changed the world as we knew it in more ways than one and, in this context, what seems certain is that the processing of health data is on the rise. As such, organisations should continue to monitor guidance and developments in this area and ensure data protection principles are at the core of all data processing activities as a first port of call.

* EDPB guidelines are no longer directly relevant to the UK data protection regime and are not binding under the UK regime.

** A legitimate interest assessment should be considered when relying on legitimate interest as a lawful basis.

Olivia Wint is a seasoned data protection professional with over five years’ experience in this area. Olivia has worked in a range of sectors including local authority, third sector, start-ups and the Big 4, advising on all aspects of data protection compliance.

A look at the European Data Protection Board guidance on supplementary measures – Olivia Wint

Data transfers have been a prominent topic in the data protection world in recent months, with the UK’s recent adequacy decision adding to the conversation on the topic.

On 21 June 2021, the European Data Protection Board (“EDPB”) published the final version of Recommendations on supplementary measures (the “Recommendations”). For context, the first draft Recommendations which were published in November 2020 were prompted as a result of the much-anticipated Schrems II judgment which was handed down in July 2020.

The Schrems II judgment follows the Schrems I judgment, which invalidated the Safe Harbour regime in 2015. The focal point of the Schrems II case concerned the legitimacy of standard contractual clauses (“SCCs”) as a transfer mechanism for cross-border data transfers from the EU to the US. Max Schrems, a privacy advocate, argued that Facebook Ireland’s transfer of a significant amount of data to the US was not adequate due to the US’ surveillance programmes, and that this fundamentally affected his right to ‘privacy, data protection and effective judicial protection’. Rather unexpectedly, the Court of Justice of the European Union (“CJEU”) declared the Privacy Shield invalid in this case, and whilst SCCs were not invalidated, the CJEU laid down stricter requirements for cross-border transfers relying on SCCs, which included additional measures to ensure that cross-border transfers have ‘essentially equivalent’ protection to that of the General Data Protection Regulation 2016/679 (“GDPR”).

As a result of the Schrems II judgment and the invalidation of the Privacy Shield, the estimated 5,300 signatories to this mechanism now need to seek alternative transfer mechanisms, and companies on a transatlantic scale have been forced to re-examine their cross-border transfers. As such, the EDPB’s Recommendations could not have come soon enough for many in the privacy world.

Based on the Schrems II judgment, supplementary measures are in essence additional safeguards layered on top of the existing transfer mechanisms cited in Article 46 GDPR, which include SCCs, binding corporate rules (“BCRs”) and approved codes of conduct, to name a few, with the overarching objective of ensuring the ‘essentially equivalent’ threshold is met.

The EDPB’s Recommendations outline six key steps which form part of an assessment when deducing the need for supplementary measures:

  1. know your transfers;
  2. identify the transfer mechanism(s) you are relying on;
  3. assess whether the transfer mechanism you are relying on is effective in light of all circumstances of the transfer;
  4. identify and adopt supplementary measures;
  5. take any formal procedural measures; and
  6. re-evaluate at appropriate intervals.

Step 1- know your transfers

Step 1 concerns organisations having a good grasp on their data processing activities, mainly evidenced through data mapping and/or records of processing activities (“ROPAs”). As ROPAs are a direct obligation under the GDPR, in theory for most organisations it will be a case of ensuring that the ROPA accurately reflects any new data processing that has occurred (with the inclusion of any third parties).

Key data protection principles should also be considered for example, lawfulness, fairness and transparency (does the privacy policy make it clear that cross border transfers are taking place?), data minimisation (is the whole data set being transferred or just what is relevant?) and accuracy (have data quality checks been conducted on the data in question?).

The Recommendations stipulate that these activities should be executed before any cross-border transfers are made, and highlight that cloud storage access is also deemed to be a transfer.

Step 2- identify the transfer mechanism(s) you are relying on

There are a number of transfer mechanisms that can be relied on for cross-border data transfers, such as SCCs, BCRs, codes of conduct and adequacy decisions, and this step requires organisations to identify the mechanism that will be used for the transfer.

The EDPB has noted that, for organisations using an adequacy decision as their transfer mechanism, the subsequent steps in the Recommendations can be discarded.

N.B. to date, the European Commission has only recognised Andorra, Argentina, Canada (commercial organisations only), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, Switzerland, Uruguay and the UK as adequate.

Step 3- Assess whether the transfer mechanism you are relying on is effective in light of all circumstances of the transfer

This is a critical part of the assessment and requires organisations to examine the third country’s legislation and practices to ascertain the extent to which there are limitations which may mean the protection afforded to the cross-border transfer is less than ‘essentially equivalent’. The Recommendations affirm that the scope of the assessment should be limited to the legislation and practices relevant to the protection of the specific data you transfer. The legislation and/or practices examined must be publicly available in the first instance, verifiable and reliable.

Key circumstances which may influence the applicable legislation and/or practices include (but are not limited to):

  • purpose for data transfer (marketing, clinical research etc);
  • sector in which transfer occurs (financial, healthcare etc);
  • categories of personal data transferred (children’s data, health data etc); and
  • format of the data (raw, pseudonymised, anonymised, encrypted at rest and in transit etc).

The assessment should be holistic in nature, cover all relevant parties such as controllers, processors and sub-processors (as identified in Step 1), and consider the effectiveness of data subject rights in practice.

Examining legislation and practices is of the utmost importance in situations where:

  1. legislation in third country does not formally meet EU standards in respect of rights/freedoms and necessity and proportionality;
  2. legislation in third country may be lacking; and
  3. legislation in third country may be problematic.

The EDPB stipulates that in scenarios 1 and 2 the transfer in question has to be suspended. There is more flexibility in scenario 3, where the transfer may be suspended, supplementary measures may be implemented, or the transfer may continue without supplementary measures if you are able to demonstrate and document that the problematic legislation will not have any bearing on the transferred data.
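The decision logic described above can be sketched in a few lines. This is an illustrative Python sketch of the three scenarios as this article summarises them; the function name and outcome strings are this sketch's own labels, not EDPB terminology:

```python
# Illustrative sketch of the Step 3 decision logic summarised above.
# Scenario numbers follow the list above; outcome labels are hypothetical.
def transfer_decision(scenario, impact_documented=False):
    """Map the three Step 3 scenarios to an outcome for the transfer."""
    if scenario in (1, 2):     # law falls short of EU standards, or is lacking
        return "suspend"
    if scenario == 3:          # law is merely problematic
        if impact_documented:  # documented as having no bearing on the data
            return "continue without supplementary measures"
        return "suspend or implement supplementary measures"
    raise ValueError("scenario must be 1, 2 or 3")

print(transfer_decision(1))                          # suspend
print(transfer_decision(3, impact_documented=True))  # continue without supplementary measures
```

The `impact_documented` flag reflects the Recommendations' emphasis on demonstrating and documenting the assessment, which also feeds the audit trail discussed later in this piece.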

Step 4- Identify and adopt supplementary measures

If, as a result of Step 3, the assessment concludes that the transfer mechanism is not effective in light of third country legislation and/or practices, then the Recommendations urge that consideration be given to whether supplementary measures exist that can ensure an ‘essentially equivalent’ level of protection. Supplementary measures can take a myriad of forms, including technical (controls such as encryption), organisational (procedures) and contractual, and must be assessed on a case-by-case basis for the specific transfer mechanism.

N.B. A non-exhaustive list of supplementary measures can be found in Annex 2 of the Recommendations.

Step 5- Take any formal procedural measures

A recurring theme throughout the Recommendations is the need for a nuanced approach to be adopted when assessing each specific transfer mechanism and as such, the procedural measures that will need to be taken are dependent on the specific transfer mechanism with some mechanisms requiring supervisory authority notification.

Step 6- Re-evaluate at appropriate intervals

As with all aspects of compliance, monitoring and re-evaluation of supplementary measures should be done frequently. The Recommendations do not explicitly define a time period; however, factors which could impact the level of protection of transferred data, such as developments in third country legislation, will trigger re-evaluation.

One of the main aims of the GDPR (and also one of its key principles) is accountability, and the EDPB’s Recommendations on supplementary measures bolster this premise. Emphasis is placed on documentation which adequately considers and records the decision-making process at each of the six steps, to ensure organisations have an accurate audit trail.

In addition to the EDPB’s Recommendations, it is important for organisations (especially global ones) to take heed of any local developments in this area. With the CNIL already publishing guidance, the ICO expected to issue guidance and the Bavarian Data Protection Authority’s ruling against Mailchimp in this area, supplementary measures will be at the crux of many impending data protection developments.

Olivia Wint is a seasoned data protection professional with over five years’ experience in this area. Olivia has worked in a range of sectors including local authority, third sector, start-ups and the Big 4, advising on all aspects of data protection compliance.

Transgender Rights Charity Mermaids fined £25,000 by the ICO for data protection breaches

It is unfortunate that some of the charities which do the most sensitive work also hold the most sensitive data, which makes data protection compliance all the more critical. Unfortunately, the transgender rights charity Mermaids has fallen foul of data protection laws through the creation of an email group that was not sufficiently restricted or encrypted to protect the data it contained.

The result was that 780 pages of emails were identifiable online over a period of three years, leaving the personal information of 550 people searchable online. Furthermore, the personal data of 24 of those people revealed how they were coping and feeling. Finally, for a further 15, data classified as special category data, such as mental and physical health and sexual orientation, was exposed.

Steve Eckersley, Director of Investigations at the ICO said:

“The very nature of Mermaids’ work should have compelled the charity to impose stringent safeguards to protect the often vulnerable people it works with. Its failure to do so subjected the very people it was trying to help to potential damage and distress and possible prejudice, harassment or abuse.

“As an established charity, Mermaids should have known the importance of keeping personal data secure and, whilst we acknowledge the important work that charities undertake, they cannot be exempt from the law.”

This serves as a warning for charities who process sensitive personal data: under the GDPR and the framework of self-reporting, you need to have appropriate technical measures in place. Failure to do so puts users’ data at risk and leaves them vulnerable. Mermaids’ penalty was imposed for the data being at risk during the period of 25 May 2018 to 14 June 2019.

It is notable that Mermaids’ data protection policies and procedures were not updated to reflect GDPR standards. Since the implementation of the Data Protection Act 2018, data protection practices have taken on increasing importance, and a robust review with practical changes to data harvesting, management, retention and rights handling is now a necessity.

Top 5 data breach fines since the implementation of the GDPR

Given the growing enforcement of the General Data Protection Regulation and the increased fine limits these laws impose, we bring you our analysis of the five highest fines, along with comments from the data protection regulators that issued them. Together, these fines showcase the practical implications of the new regulation and how some of the biggest companies fell foul of sanctions. Analysis is given as at 24 December 2020.

Continue reading