The Personal Data life cycle: Where to start the analysis? – Vladyslav Tamashev, Privacy lawyer at Legal IT Group

Have you ever thought about the data on your computer? It doesn’t matter whether you are a content creator, a programmer, or just a regular user: thousands of different files have been created, downloaded, and altered on your device. But what happens when some of that data becomes useless to you?

Usually, this data is manually deleted to free up space on a storage device, or it is wiped during an OS reinstallation. Everything that happens to that data, from its creation or collection until its destruction, is called the data life cycle.

The data life cycle is the sequence of stages that a particular unit of data goes through. The simplified life cycle model has five basic stages: Collection, Processing, Retention, Disclosure, and Destruction. In practice, when we talk about the personal data life cycle, this sequence can be dramatically different, depending on the type of information, its usage and origin, company policies, and personal data protection regulations and legislation.
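
To make the simplified model concrete, here is a minimal illustrative sketch (my own addition, not part of the article) that records life-cycle events for a single unit of personal data; the Stage enum and DataRecord class are invented names used only for demonstration.

```python
# Illustrative sketch only: modelling the simplified five-stage life cycle and
# an event log for one unit of personal data. All names here are invented.
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


class Stage(Enum):
    COLLECTION = "collection"
    PROCESSING = "processing"
    RETENTION = "retention"
    DISCLOSURE = "disclosure"
    DESTRUCTION = "destruction"


@dataclass
class DataRecord:
    """One unit of personal data and the life-cycle events applied to it."""
    subject: str
    category: str                        # e.g. "email", "payment information"
    events: list = field(default_factory=list)

    def log(self, stage: Stage, note: str = "") -> None:
        self.events.append((datetime.utcnow(), stage, note))


record = DataRecord(subject="user-42", category="email")
record.log(Stage.COLLECTION, "registration form")
record.log(Stage.PROCESSING, "account creation")
record.log(Stage.DESTRUCTION, "erasure request honoured")
for when, stage, note in record.events:
    print(when.isoformat(), stage.value, note)
```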

Nowadays, one of the most challenging quests for an IT company is to build efficient and secure data processing inside the company that is compliant with local legislation and international regulations. Data life cycle optimization and adaptation is a complex task that starts with a personal data life cycle analysis.

The first step in personal data life cycle analysis is to define the principles of data collection and processing in the company.

Here are some simple questions that will help you:

  • What’s the purpose of users’ data collection in your company? (marketing, statistics, software optimization, etc.)
  • What information is collected? (name, payment information, location, music preferences, etc.)
  • How was it collected? (directly from the user, surveillance, third parties, etc.)
  • Which categories of data are necessary and which are not? (For example, email, name, and payment information are necessary for the company; a profile photo, favorite music band, or phone number is not.)
  • Who will have access to that data? (top management, outsourced teams, all processes are automated, etc.)
  • Will that data be shared with third parties? (no, contractors, processors, etc.)

In the second step, the data should be divided into categories and analyzed for the risks associated with it. Risk analysis will help to highlight the most critical and valuable data categories. There are lots of risk determination approaches, but most of them use the probability of negative events and the possible negative consequences in different variations.

risk = probability of a negative event × negative consequences

For example, factors such as potential vulnerabilities, possible negative events, the intensity of negative effects, and the response to negative effects may be used for more precise risk determination. Personal data such as identification and payment information is at much higher risk of dedicated hacker attacks than, for example, less valuable website usage statistics or users’ interface color scheme preferences.
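
As a rough illustration of the formula above, the following sketch (not from the article; the scales and scores are invented for demonstration) scores a few example data categories on 1–5 scales and ranks them by risk.

```python
# Hedged illustration of risk = probability of a negative event x negative
# consequences, using invented 1-5 scores; real assessments follow an
# organisation-specific methodology.
categories = {
    # category: (probability of a negative event, severity of consequences)
    "payment information": (4, 5),
    "identification data": (4, 4),
    "usage statistics": (2, 2),
    "UI colour preference": (1, 1),
}

for name, (probability, consequences) in sorted(
    categories.items(), key=lambda item: item[1][0] * item[1][1], reverse=True
):
    risk = probability * consequences
    print(f"{name:22} risk = {probability} x {consequences} = {risk}")
```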

After general analysis, each stage of the life cycle should be analyzed separately.

Collection – the first stage of the data life cycle. Users must be informed about the collection of their data in the form of a consent request or a notice. In terms of collection mechanics, data can be obtained directly (e.g. a registration form) or indirectly (surveillance, third parties).

Processing – this stage is unique for each company. It can be done manually by company employees, automatically, or with a mixed approach, depending on the data category and the purpose of collection. The main principles are to process as little information as is needed to perform the company’s tasks and to restrict unauthorized access.

Retention – means the storage of information. The data should be stored no longer than necessary or than defined by the data policy. For the data life cycle analysis, this stage is the key point: depending on the data type, it can be reused, destroyed, or disclosed.

Destruction – simple deletion is enough for most scenarios, but full data destruction means that the data should be wiped from servers, backup files, internal documentation, employees’ PCs, and any other storage devices connected to the company. That is why data tracking should be applied inside the company.
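
As a simple illustration of what “data tracking” can mean in practice, the sketch below (my own, with invented names) keeps a registry of every location that holds a copy of a data item, so destruction can be verified across all of them.

```python
# Illustrative sketch only: a tiny registry of where copies of a data item live,
# so that destruction can be checked against servers, backups and other devices.
from collections import defaultdict

locations = defaultdict(set)        # data item id -> set of places holding a copy


def register_copy(item_id: str, location: str) -> None:
    locations[item_id].add(location)


def destroy(item_id: str, location: str) -> None:
    locations[item_id].discard(location)


def fully_destroyed(item_id: str) -> bool:
    # Destruction is only complete when no known location still holds a copy.
    return not locations[item_id]


register_copy("user-42/email", "prod-db")
register_copy("user-42/email", "backup-2021-12")
destroy("user-42/email", "prod-db")
print(fully_destroyed("user-42/email"))   # False: the backup copy remains
```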

Reuse – the most common stage of the data life cycle. Each time you log into an account or receive a personalized email, your data is reused by the company and altered according to your actions.

Disclosure – data sharing is important for providing good services and promoting your business. Advertising, statistics, marketing, and other services are based mostly on third-party data disclosure. During the analysis, you should ensure that any data transfer is compliant with legislation and the company’s privacy policy, and is allowed by the user.

The personal data life cycle analysis is a complex process that touches almost every aspect of the company: its data flow, business model, and internal and external structure. But it’s the first step in developing a data processing system that is resistant to external and internal threats and puts users’ privacy and data security first.

Vladyslav Tamashev
Privacy lawyer at
Legal IT Group

Top 10 EU and UK Data Breach fines of 2021: a selection – Suneet Sharma

This is my selection of the top five data breach fines in each of the EU and the United Kingdom in 2021, many of which have featured in our Law and Media Round Ups over the past year.

EU Fines

  1. Amazon Europe Core S.a.r.l €746,000,000

Luxembourg’s National Commission for Data Protection issued a fine under the GDPR to Amazon Europe Core S.a.r.l. Amazon plans to appeal the penalty, stating: “there has been no data breach, and no customer data has been exposed to any third party… these facts are undisputed. We strongly disagree with the CNPD’s ruling.” Whilst Luxembourg’s national data protection law precludes the Commission from commenting on individual cases, Amazon disclosed the fine in a filing of its quarterly results with the US Securities and Exchange Commission.

From what we can gather, the fine followed a May 2018 complaint by La Quadrature du Net. It is by far the biggest fine under the GDPR to date.

Bloomberg has the initial report. The fine attracted much coverage from the BBC, Pinsent Masons and the Hunton Privacy Blog.

  2. WhatsApp Ireland Ltd  €225,000,000

On 2 September 2021 the Irish Data Protection Commission announced a fine of €225,000,000 against WhatsApp. The investigation began on 10 December 2018 and examined whether WhatsApp had discharged its GDPR transparency obligations with regard to the provision of information, and the transparency of that information, to both users and non-users of WhatsApp’s service. This includes information provided to data subjects about the processing of information between WhatsApp and other Facebook companies.

The case is notable due to its cross-border nature, which required data protection authorities in France, Germany and the Netherlands to consider it. The fine was considered by the European Data Protection Board, which mandated a reassessment and increase. WhatsApp disagreed with the fine, calling it “wholly disproportionate”.

The IAPP, Bird & Bird and Pinsent Masons have coverage of the fine.

  3. notebooksbilliger.de  €10,400,000

The State Commissioner for Data Protection in Lower Saxony fined notebooksbilliger.de AG €10,400,000 in a decision issued in December 2020. The Commissioner found that the company had been using video surveillance to monitor its employees for at least two years without any legal justification. Areas recorded included workspaces, sales floors, warehouses and staff rooms.

Whilst the company argued the cameras had been installed to prevent theft, it should first have tried to implement less severe means. Furthermore, the recordings were saved for 60 days, which was much longer than deemed necessary.

“This is a serious case of workplace surveillance”, says the State Commissioner for Data Protection in Lower Saxony, Barbara Thiel. “Companies have to understand that such intensive video surveillance is a major violation of their employees’ rights”. While businesses often argue that video surveillance can be effectively used to deter criminals, this does not justify the permanent and unjustified interference with the personal rights of their employees. “If that were the case, companies would be able to extend their surveillance without limit. Employees do not have to sacrifice their personal rights just because their employer puts them under general suspicion”, explains Thiel. “Video surveillance is a particularly invasive encroachment on a person’s rights, because their entire behaviour can theoretically be observed and analysed. According to the case law of the Federal Labour Court, this can put staff under pressure to act as inconspicuously as possible to avoid being criticised or sanctioned for their behaviour”.

Data Privacy Manager, Data Guidance, Simmons & Simmons and Luther have commentary.

  4. Austrian Post  €9,500,000

The Austrian Data Protection Authority issued a fine of €9,500,000 to the Austrian Post alleging that it had not enabled data protection enquiries via email.

In October 2019 the Post received an €18,000,000 fine for processing personal data on the alleged political affinity of affected data subjects; that fine was annulled by a court decision in November 2020. The Post has announced it plans to appeal this second penalty: “The allegations made by the Authority mainly relate to the fact that, in addition to the contact opportunities made available by Austrian Post via mail, a web contact form and the company’s customer service centre, inquiries about personal data must also be made possible via e-mail. Austrian Post also intends to launch an appeal against this decision.”

See coverage from Data Guidance.

  5. Vodafone España  €8,150,000

From April 2018 to September 2019, 191 complaints were received in similar cases concerning telephone calls and SMS messages sent to citizens who had opposed the processing of their data for advertising. The failure of Vodafone to avoid advertising actions towards those citizens who had exercised their rights of opposition or erasure of their data justified the fine.

Coverage was broad, with Compliance Week, Data Guidance and Stephenson Harwood commenting.

United Kingdom Fines

The ICO has issued 35 monetary penalty notices thus far in 2021. Below we take a look at a selection of them.

  1. Clearview AI  £17 million

The Information Commissioner’s Office (“ICO”) has issued a provisional view on the imposition of a £17m fine on Clearview AI. The BBC reports that the firm’s database holds over 10bn images. The ICO has issued a provisional notice to stop further processing of the personal data of people in the UK and to delete any such data, following alleged serious breaches of the UK’s data protection laws.

In a joint investigation with the Australian Information Commissioner (“AIC”) the ICO concluded that the data, some scraped from the internet, was being processed, in the case of UK persons, unlawfully in some instances.

Clearview AI Inc’s services were being used on a free trial basis by some law enforcement agencies. This has been confirmed to no longer be the case.

The ICO’s preliminary view is that Clearview AI Inc appears to have failed to comply with UK data protection laws in several ways including by:

  • failing to process the information of people in the UK in a way they are likely to expect or that is fair;
  • failing to have a process in place to stop the data being retained indefinitely;
  • failing to have a lawful reason for collecting the information;
  • failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR);
  • failing to inform people in the UK about what is happening to their data; and
  • asking for additional personal information, including photos, which may have acted as a disincentive to individuals who wish to object to their data being processed.

Information Commissioner Elizabeth Denham commented:

“I have significant concerns that personal data was processed in a way that nobody in the UK will have expected. It is therefore only right that the ICO alerts people to the scale of this potential breach and the proposed action we’re taking. UK data protection legislation does not stop the effective use of technology to fight crime, but to enjoy public trust and confidence in their products technology providers must ensure people’s legal protections are respected and complied with.

Clearview AI Inc’s services are no longer being offered in the UK. However, the evidence we’ve gathered and analysed suggests Clearview AI Inc were and may be continuing to process significant volumes of UK people’s information without their knowledge. We therefore want to assure the UK public that we are considering these alleged breaches and taking them very seriously.”

 The ICO press release can be found here and the AIC press release here.

The previous statement of the ICO on the conclusion of the joint investigation can be found here.

  2. Cabinet Office  £500,000

The Cabinet Office was fined £500,000 on 2 December 2021 for disclosing the postal addresses of the 2020 New Year Honours recipients online. In finding that the Cabinet Office had failed to put appropriate technical and organisational measures in place, the ICO noted that the data was accessed 3,872 times.

The ICO received three complaints from affected individuals who raised personal safety concerns, and 27 contacts from individuals citing similar concerns. Steve Eckersley, ICO Director of Investigations, said:

“When data breaches happen, they have real life consequences. In this case, more than 1,000 people were affected. At a time when they should have been celebrating and enjoying the announcement of their honour, they were faced with the distress of their personal details being exposed.

“The Cabinet Office’s complacency and failure to mitigate the risk of a data breach meant that hundreds of people were potentially exposed to the risk of identity fraud and threats to their personal safety.

 “The fine issued today sends a message to other organisations that looking after people’s information safely, as well as regularly checking that appropriate measures are in place, must be at the top of their agenda.”

The Guardian reports on the data breach as does Data Guidance.

  3. EB Associates Group Limited  £140,000

The ICO issued its largest fine to date to EB Associates Group Limited for instigating over 107,000 illegal cold calls to people about pensions. The practice has been banned since 2019.

Andy Curry, Head of ICO Investigations, said:

“Our priority is to protect people and we will always take robust action against companies operating illegally for their own financial gain.

“Cold calls about pensions were banned to protect people from scammers trying to cheat them out of their retirement plans.

“We encourage anyone who receives an unexpected call about their pension to hang up and then report it to us.”

The fine was covered by Professional Pensions.

  4. Mermaids  £25,000

It is unfortunate at times that some charities which do the most sensitive of work also hold the most sensitive data. It makes data protection compliance all the more critical. Unfortunately, the transgender rights charity Mermaids fell afoul of data protection laws in the creation of an email group that was not sufficiently secured or encrypted to protect the data it contained.

The result was that the 780 email pages were viewable online over a period of three years. This led to the personal information of 550 people being searchable online. Furthermore, the personal data of 24 of those people revealed how they were coping and feeling. Finally, for a further 15 people the exposed data was classified as special category data, as details of mental and physical health and sexual orientation were exposed.

Steve Eckersley, Director of Investigations at the ICO, said:

“The very nature of Mermaids’ work should have compelled the charity to impose stringent safeguards to protect the often-vulnerable people it works with. Its failure to do so subjected the very people it was trying to help to potential damage and distress and possible prejudice, harassment or abuse.

 “As an established charity, Mermaids should have known the importance of keeping personal data secure and, whilst we acknowledge the important work that charities undertake, they cannot be exempt from the law.”

This serves as a warning call for charities which process sensitive personal data: under the GDPR and the framework of self-reporting, you need to have appropriate technical measures in place. Failure to do so puts users’ data at risk and leaves them vulnerable. Mermaids’ penalty was imposed for the data being at risk for the period of 25 May 2018 to 14 June 2019.

It is notable that Mermaids’ data protection policies and procedures were not updated to reflect GDPR standards. Since the implementation of the Data Protection Act 2018, data protection practices have taken on increasing importance, and a robust review with practical changes to data harvesting, management, retention and rights handling is now a necessity.

DAC Beachcroft comments, as do Slaughter and May, the Independent and EM Law.

  5. HIV Scotland  £10,000

In a cautionary tale for those using bulk email practices, HIV Scotland was fined £10,000 for sending an email to 105 people, including patient advocates representing people living in Scotland with HIV. All the email addresses were visible to all recipients, and 65 of the addresses identified people by name.

From the personal data disclosed, an assumption could be made about individuals’ HIV status or risk. The ICO’s investigation found inadequate staff training, incorrect methods of sending bulk emails by blind carbon copy and an inadequate data protection policy.

Ken Macdonald, Head of ICO Regions, said:

“All personal data is important but the very nature of HIV Scotland’s work should have compelled it to take particular care. This avoidable error caused distress to the very people the charity seeks to help.

 “I would encourage all organisations to revisit their bulk email policies to ensure they have robust procedures in place.”

The BBC, Keller Lenker and the Times have coverage.

Suneet Sharma is a junior legal professional with a particular interest and experience in media, information and privacy law.  He is the editor of The Privacy Perspective blog.

Quotes from caselaw 5: Lloyd v Google LLC [2021] UKSC 50 – no one size fits all claim available in data protection “Safari Workaround” class action

In one of the most significant privacy law judgments of the year, the UK Supreme Court considered whether a class action could be brought against Google for breach of its obligations as a data controller under s.4(4) Data Protection Act 1998 (“DPA”) through its application of the “Safari Workaround”. The claim for compensation was made under s.13 DPA 1998.

The amount claimed per person advanced in the letter of claim was £750. Collectively, with the number of people impacted by the processing, the potential liability of Google was estimated to exceed £3bn.

“The claim alleges that, for several months in late 2011 and early 2012, Google secretly tracked the internet activity of millions of Apple iPhone users and used the data collected in this way for commercial purposes without the users’ knowledge or consent.”

Lord Leggatt at p.1

The class action claim was brought under rule 19.6 of the Civil Procedure Rules.

Lord Leggatt handed down the unanimous judgement in favour of the appellant Google LLC:

“the claim has no real prospect of success. That in turn is because, in the way the claim has been framed in order to try to bring it as a representative action, the claimant seeks damages under section 13 of the DPA 1998 for each individual member of the represented class without attempting to show that any wrongful use was made by Google of personal data relating to that individual or that the individual suffered any material damage or distress as a result of a breach of the requirements of the Act by Google.”

At p.159

It should be noted that the claim was brought under the Data Protection Act 1998 and not under the GDPR.

See the full judgement here. The Panopticon Blog has an excellent summary.

ICO issues provisional view to fine Clearview AI Inc over £17 million

The Information Commissioner’s Office (“ICO”) has issued a provisional view on the imposition of a £17m fine on Clearview AI.

The BBC reports that the firm’s database holds over 10bn images. The ICO has issued a provisional notice to stop further processing of the personal data of people in the UK and to delete any such data, following alleged serious breaches of the UK’s data protection laws.

In a joint investigation with the Australian Information Commissioner (“AIC”) the ICO concluded that the data, some scraped from the internet, was being processed, in the case of UK persons, unlawfully in some instances.

Clearview AI Inc’s services were being used on a free trial basis by some law enforcement agencies. This has been confirmed to no longer be the case.

The ICO’s preliminary view is that Clearview AI Inc appears to have failed to comply with UK data protection laws in several ways including by:

  • failing to process the information of people in the UK in a way they are likely to expect or that is fair;
  • failing to have a process in place to stop the data being retained indefinitely;
  • failing to have a lawful reason for collecting the information;
  • failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and UK GDPR);
  • failing to inform people in the UK about what is happening to their data; and
  • asking for additional personal information, including photos, which may have acted as a disincentive to individuals who wish to object to their data being processed.

Information Commissioner Elizabeth Denham commented:

“I have significant concerns that personal data was processed in a way that nobody in the UK will have expected. It is therefore only right that the ICO alerts people to the scale of this potential breach and the proposed action we’re taking. UK data protection legislation does not stop the effective use of technology to fight crime, but to enjoy public trust and confidence in their products technology providers must ensure people’s legal protections are respected and complied with.

Clearview AI Inc’s services are no longer being offered in the UK. However, the evidence we’ve gathered and analysed suggests Clearview AI Inc were and may be continuing to process significant volumes of UK people’s information without their knowledge. We therefore want to assure the UK public that we are considering these alleged breaches and taking them very seriously.”

This is one of the largest fines issued under the GDPR to date. Clearview now has the opportunity to respond, both in the UK and Australia (the AIC has found breaches of Australian privacy laws).

It’s unsurprising that its database, said to have included images scraped from social media, has drawn the attention of regulators. Facial recognition services have been at the forefront of recent data analytics scrutiny and data protection enforceability.

The ICO press release can be found here and the AIC press release here.

The previous statement of the ICO on the conclusion of the joint investigation can be found here.

Citation: BBC: WhatsApp changes privacy policy after Irish data protection authority issues £190m fine

The BBC has an insightful article on WhatsApp’s behaviour after the sanctions imposed on it by the Irish Data Protection Authority, which fined it £190m in September 2021.

According to the BBC, the tweaks are designed to “add additional detail around [WhatsApp’s] existing practices”, and will only appear in the European version of the privacy policy, which is already different from the version that applies in the rest of the world.

“There are no changes to our processes or contractual agreements with users, and users will not be required to agree to anything or to take any action in order to continue using WhatsApp,” the company said, announcing the change.

WhatsApp is appealing the fine imposed against it by the Irish Data Protection Commissioner.

ICO launches consultation on the Draft Journalism Code of Practice

The ICO’s consultation on its Draft Journalism Code of Practice has begun.

Be sure to have your say: the deadline to submit responses is 22 January 2022.

The Code covers privacy safeguards among many other topics. In particular, it covers the journalism exemption under the Data Protection Act 2018, a broad exemption that disapplies certain requirements relating to the holding and processing of data.

Journalism should be balanced with other rights that are also fundamentally important to democracy, such as data protection and the right to privacy.

at p.4

The Code substantively addresses the safeguarding of journalism under the exemption, briefly touching on balancing a free press against privacy rights before going on to discuss how this balance is struck under data protection laws:

Why is it important to balance journalism and privacy?


It is widely accepted that a free press, especially a diverse press, is a fundamental component of a democracy. It is associated with strong and important public benefits worthy of special protection. This in itself is a public interest.

Most obviously, a free press plays a vital role in the free flow of communications in a democracy. It increases knowledge, informs debates and helps citizens to participate more fully in society. All forms of journalistic content can perform this crucial role, from day-to-day stories about local events to celebrity gossip to major public interest investigations.

A free press is also regarded as a public watch-dog. It acts as an important check on political and other forms of power, and in particular abuses of power. In this way, it helps citizens to hold the powerful to account.

However, the right to freedom of expression and information should be balanced with other rights that are necessary in a democratic society, such as the right to privacy. The public interest in individual freedom of expression is itself an aspect of a broader public interest in the autonomy, integrity and dignity of individuals.

The influence and power of the press in society, and the reach of the internet, means that it is particularly important to balance journalism and people’s right to privacy.

This code provides guidance about balancing these two important rights by helping you to understand what data protection law requires and how to comply with these requirements effectively.

at p.25

Quotes from caselaw 3: Fairhurst v Woodard (Case No: G00MK161) – A cautionary tale for neighbours implementing surveillance

“I am satisfied that the extent of range to which these devices can capture audio is well beyond the range of video that they capture, and in my view cannot be said to be reasonable for the purpose for which the devices are used by the Defendant, since the legitimate aim for which they are said to be used, namely crime prevention, could surely be achieved by something less. A great deal of the purpose could be achieved without audio at all, as is the case with the bulk of CCTV systems in use in public places in this country, or by a microphone that only picks up sound within a small diameter of the device.

That finding means that I am satisfied that the processing of such audio data by the Defendant as data controller is not lawful. The extent of the range means that personal data may be captured from people who are not even aware that the device is there, or that it records and processes audio personal data, or that it can do so from such a distance away, in breach of the first principle.”

Melissa Clarke HHJ at p.137

In Fairhurst a neighbour complained that the use of several cameras, including a Ring doorbell, amounted to nuisance, harassment and breach of the Data Protection Act 2018.

The claims in harassment and data protection succeeded. It was, in particular, noted that the audio recording capability of the devices was much broader than the video recording capability. As the above quote shows, the extent of the processing of the audio recording data was such that it was unlawful under data protection laws.

The audio recording capability of the Ring device extended 40-68ft (12-20m).

Amazon released a statement following the finding in the case: “We strongly encourage our customers to respect their neighbours’ privacy and comply with any applicable laws when using their Ring product.”

The case serves as a cautionary tale for those seeking to implement surveillance around their homes that impinges upon their neighbours.

INFORRM has an excellent case comment for interested readers, as does the Guardian.

Healthcare data and data protection in the time of coronavirus – Olivia Wint

The processing of special category personal data (including health data, e.g. vaccination status, blood type, health conditions, etc.) was a common topic before the COVID-19 pandemic (the “pandemic”), with various resources published that explored it.

For example, the European Data Protection Board (“EDPB”) published an adopted opinion on the interplay between the Clinical Trials Regulation and the General Data Protection Regulation* (“GDPR”) (23 January 2019), the Information Commissioner’s Office (“ICO”) posted a blog on why special category personal data needs to be handled even more carefully (14 November 2019), and the ICO published guidance on the lawful bases for processing special category data in compliance with the GDPR (November 2019).

The pandemic has brought about a number of data protection considerations, all of which were already in existence but have been exacerbated by the pandemic (employee monitoring, contact tracing, the workforce shift from office to home, etc.). One consideration that is more prevalent than ever before is the processing of health data; this piece aims to cover some key data protection themes and practical insights into the processing of health data.

Health data, a subset of special category personal data, by its very nature comes with an increased risk profile. When processing this data type, there are not only legislative data protection requirements and the expectation of good clinical governance practices, but also regulatory body considerations.

For example, the NHS Care Quality Commission has in place a code of practice on confidential personal information; the NHS Health Research Authority has in place GDPR guidance specifically for researchers and study coordinators, and technical guidance for those responsible for information governance within their organisation; and the NHS more generally has in place its Data Security and Protection Toolkit (the “Toolkit”). The Toolkit is an online self-assessment tool that enables organisations to measure and publish their performance against the National Data Guardian’s ten data security standards. The Toolkit covers records management and retention, training and awareness, system vulnerability management and crisis management, to name a few.

The above is all at a national level (UK); at an international level, there are data protection laws which specifically cover health data, such as HIPAA in the US, the Patient Data Protection Act in Germany, and various provincial health data privacy laws in Canada, such as the Health Information Act in Alberta.

Whilst the previous paragraph highlights the complexities of processing health data, whether at a national or international level, in comparison to other data types, there are a number of mitigations that organisations can put in place to adequately reduce the risks associated with processing this type of data. Mitigations such as Data Protection Impact Assessments (“DPIAs”), updated privacy notices and appropriate security measures, amongst other things, should all be considered.

Many organisations that never historically processed health data may now do so as a result of the pandemic…

Covering your bases

The first base that must be covered when processing data is ensuring that an appropriate legal basis has been established for each data processing activity; so, for example, if health data is processed for both employee monitoring and research, a legal basis for each of these activities will need to be established. Legal bases include the performance of a contract, the legitimate interests** of the organisation and/or compliance with a legal obligation. Where processing of health data is concerned, an additional condition under Article 9 of the UK GDPR must be met. In the healthcare context, applicable additional conditions may include explicit consent, health or social care purposes, public health purposes and/or archiving, research and statistical purposes.

Many organisations that never historically processed health data may now do so as a result of the pandemic; alternatively, organisations that processed health data pre-pandemic may now be doing so in larger volumes. Organisations on either side of the coin should also assess the extent to which their privacy notice(s) have been updated and/or need to be updated in order to make data subjects aware of any applicable data processing changes and to comply with transparency obligations.

Next, large-scale processing of health data may pose a ‘high risk to the rights and freedoms of natural persons’ and, in such cases, will trigger the requirement for a DPIA. In order for a DPIA to have value, it is important for organisations to ensure that it is assessed and considered early on, so that privacy by design and default is built into any system or processing activity.

A DPIA will assess the likelihood and severity of harm related to the processing activity in question, and should the DPIA identify a high risk with no available mitigations, consultation with the ICO will be needed. The ICO has set out a 9-step lifecycle for the DPIA, all of which should be considered before any data processing takes place:

  1. Identify a need for a DPIA;
  2. Describe the processing;
  3. Consider consultation;
  4. Assess necessity and proportionality;
  5. Identify and assess risks;
  6. Identify measures to mitigate risk;
  7. Sign off and record outcomes;
  8. Integrate outcomes into plan; and
  9. Keep under review.

Internally, organisations should have appropriate technical and organisational measures in place which reflect the risk presented. In relation to technical measures, appropriate internal controls and security measures should be utilised. Organisations may wish to consider a combination of controls to ensure that health data has the best level of protection; this may include end-to-end encryption for data both in transit and at rest, role-based access within organisations, and the adoption of and accreditation against industry-recognised security standards such as ISO 27001.
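
As a minimal sketch of two of the controls mentioned above (encryption of data at rest and role-based access), and assuming the third-party Python “cryptography” package is available, an organisation might combine them roughly as follows; the roles and sample data are invented for illustration only.

```python
# Illustrative only: encrypting health data at rest with the "cryptography"
# package (pip install cryptography) plus a simple role-based access check.
from cryptography.fernet import Fernet

ROLE_PERMISSIONS = {
    "clinician": {"read_health_data"},
    "researcher": {"read_pseudonymised_data"},
    "marketing": set(),                      # no access to health data
}


def can_read_health_data(role: str) -> bool:
    return "read_health_data" in ROLE_PERMISSIONS.get(role, set())


key = Fernet.generate_key()                  # in practice, held in a key management system
fernet = Fernet(key)
stored = fernet.encrypt(b"blood type: O negative")   # ciphertext kept at rest

if can_read_health_data("clinician"):
    print(fernet.decrypt(stored).decode())
print(can_read_health_data("marketing"))     # False
```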

In respect of organisational measures, it may be apt for training and awareness sessions to be implemented, with tailored training administered to employees who will be carrying out data processing activities, and for a robust policy suite to be in place which covers key circumstances such as data breaches and business continuity.

Data sharing

A specific data processing activity that may be utilised more in the wake of the pandemic is data sharing between organisations for information and research purposes. In England, the soon-to-be-implemented GP Data Sharing Scheme aims to create a new framework for a central NHS digital database built from GP records, and the UK’s Department of Health and Social Care (“DHSC”) has recently published a draft policy paper titled ‘Data saves lives: reshaping health and social care with data’. The policy covers the aspiration of the DHSC to introduce new legislation as part of the Health and Care Bill (currently at Committee stage) to encourage data sharing between private health providers and the NHS, and to have more guard rails around the sharing of data generally through mandating standards for how data is collected and stored.

As data sharing, as evidenced by the above, is something that will be advocated for and welcomed in due course, it is important that organisations have in place the appropriate contractual and practical measures to protect data, as data in motion is when it is most vulnerable. Contractual measures include ensuring data sharing and/or transfer agreements are in place which cover all necessary contractual provisions and provide adequate assurances as to the data sharing/transfer arrangements. The NHSX has published a template Data Sharing Agreement which has been labelled as suitable for use by all health and care organisations and includes risk management, legal basis and confidentiality and privacy provisions, amongst other things. Practical measures include conducting due diligence checks on all organisations which may be in receipt of data as part of the data sharing process (including third parties) and anonymising/pseudonymising data (see the sketch below). The ICO has put in place a comprehensive data sharing checklist which invites organisations to consider data minimisation, accountability and data subject rights.
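
As a rough sketch of the pseudonymisation step mentioned above (my own illustration; the field names and sample record are invented), a keyed hash can replace direct identifiers before a dataset is shared, with the key retained by the sharing organisation.

```python
# Illustrative only: pseudonymising a direct identifier with a keyed hash
# (HMAC-SHA256) before sharing, keeping only the fields that are needed.
import hashlib
import hmac

SECRET_KEY = b"keep-this-in-a-key-vault"     # never shared with the data recipient


def pseudonymise(value: str) -> str:
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]


record = {"nhs_number": "943 476 5919", "postcode": "AB1 2CD", "test_result": "negative"}
shared = {
    "patient_ref": pseudonymise(record["nhs_number"]),   # stable pseudonym
    "test_result": record["test_result"],                 # data minimisation
}
print(shared)
```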

The pandemic has changed the world as we knew it in more ways than one, and in the context of health data, what seems certain is that the processing of health data is on the rise. As such, organisations should continue to monitor guidance and developments in this area and ensure data protection principles are at the core of all data processing activities as a first port of call.

* EDPB guidelines are no longer directly relevant to the UK data protection regime and are not binding under the UK regime.

** A legitimate interest assessment should be considered when relying on legitimate interest as a lawful basis.

Olivia Wint is a seasoned data protection professional, with over five years’ experience in this area. Olivia has worked in a range of sectors including local authority, third sector, start-ups and the Big 4, advising on all aspects of data protection compliance.

A look at the European Data Protection Board guidance on supplementary measures – Olivia Wint

Data transfers have been a prominent topic in the data protection world in recent months, with the UK’s recent adequacy decision adding to the conversation on the topic.

On 21 June 2021, the European Data Protection Board (“EDPB”) published the final version of its Recommendations on supplementary measures (the “Recommendations”). For context, the first draft Recommendations, published in November 2020, were prompted by the much-anticipated Schrems II judgment handed down in July 2020.

The Schrems II judgment follows the Schrems I judgment, which invalidated the Safe Harbour regime in 2015. The focal point of the Schrems II case concerned the legitimacy of standard contractual clauses (“SCCs”) as a transfer mechanism for cross-border data transfers from the EU to the US. Max Schrems, a privacy advocate, argued that Facebook Ireland’s transfer of a significant amount of data to the US was not adequate due to the US’ surveillance programmes, and that this fundamentally affected his right to ‘privacy, data protection and effective judicial protection’. Rather unexpectedly, the Court of Justice of the European Union (“CJEU”) declared the Privacy Shield invalid in this case and, whilst SCCs were not invalidated, the CJEU laid down stricter requirements for cross-border transfers relying on SCCs, which included additional measures to ensure that cross-border transfers have ‘essentially equivalent’ protection to that of the General Data Protection Regulation 2016/679 (“GDPR”).

As a result of the Schrems II judgment and the invalidation of the Privacy Shield, the estimated 5,300 signatories to that mechanism now need to seek alternative transfer mechanisms, and companies on a transatlantic scale have been forced to re-examine their cross-border transfers. As such, the EDPB’s Recommendations could not have come sooner for many in the privacy world.

Based on the Schrems II judgment, supplementary measures are in essence additional safeguards on top of any of the existing transfer mechanisms cited in Article 46 GDPR, which include SCCs, binding corporate rules (“BCRs”) and approved codes of conduct, to name a few, with the overarching objective of the supplementary measures being to ensure the ‘essentially equivalent’ threshold is met.

The EDPB’s Recommendations outline six key steps which form part of an assessment when determining the need for supplementary measures:

  1. know your transfers;
  2. identify the transfer mechanism(s) you are relying on;
  3. assess whether the transfer mechanism you are relying on is effective in light of all circumstances of the transfer;
  4. identify and adopt supplementary measures;
  5. take any formal procedural measures; and
  6. re-evaluate at appropriate intervals.

Step 1- know your transfers

Step 1 concerns organisations having a good grasp on their data processing activities, mainly evidenced through data mapping and/or records of processing activities (“ROPAs”). As ROPAs are a direct obligation under the GDPR, in theory for most organisations it will be a case of ensuring that the ROPA accurately reflects any new data processing that has occurred (with the inclusion of any third parties).

Key data protection principles should also be considered, for example lawfulness, fairness and transparency (does the privacy policy make it clear that cross-border transfers are taking place?), data minimisation (is the whole data set being transferred or just what is relevant?) and accuracy (have data quality checks been conducted on the data in question?).

The Recommendations stipulate that these activities should be executed before any cross-border transfers are made, and highlight the fact that access to data in cloud storage is also deemed to be a transfer.

Step 2- identify the transfer mechanism(s) you are relying on

There are a number of transfer mechanisms that can be relied on for cross-border data transfers, such as SCCs, BCRs, codes of conduct and adequacy decisions, and this step requires organisations to identify the mechanism that will be used for the transfer.

The EDPB has noted that, for organisations using an adequacy decision as their transfer mechanism, the subsequent steps in the Recommendations can be disregarded.

N.B. to date, the European Commission has only recognised Andorra, Argentina, Canada (commercial organisations only), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, Switzerland, Uruguay and the UK as providing adequate protection.

Step 3- Assess whether the transfer mechanism you are relying on is effective in light of all circumstances of the transfer

This is a critical part of the assessment and requires organisations to assess and examine the third country’s legislation and practices to ascertain the extent to which there are limitations which may mean the protection afforded as a result of the cross-border transfer is less than ‘essentially equivalent’. The Recommendations affirm that the scope of the assessment needs to be limited to the legislation and practices relevant to the protection of the specific data you transfer. The legislation and/or practices examined must be publicly available in the first instance, verifiable and reliable.

Key circumstances which may influence the applicable legislation and/or practices include (but are not limited to):

  • purpose for data transfer (marketing, clinical research etc);
  • sector in which transfer occurs (financial, healthcare etc);
  • categories of personal data transferred (children’s data, health data etc); and
  • format of the data (raw, pseudonymised, anonymised, encrypted at rest and in transit etc).

The assessment should be holistic in nature and cover all relevant parties such as controllers, processors and sub-processors (as identified in Step 1), and should consider the effectiveness of data subject rights in practice.

Examining legislation and practices is of the utmost importance in situations where:

  1. legislation in third country does not formally meet EU standards in respect of rights/freedoms and necessity and proportionality;
  2. legislation in third country may be lacking; and
  3. legislation in third country may be problematic.

The EDPB stipulates that in scenarios 1 and 2 the transfer in question has to be suspended; there is more flexibility in scenario 3, where the transfer may either be suspended, supplementary measures may be implemented, or the transfer may continue without supplementary measures if you are able to demonstrate and document that the problematic legislation will not have any bearing on the transferred data.

Step 4- Identify and adopt supplementary measures

If, as a result of Step 3, the assessment concludes that the transfer mechanism is not effective in light of the third country’s legislation and/or practices, then the Recommendations urge that consideration be given to whether or not supplementary measures exist that can ensure an ‘essentially equivalent’ level of protection. Supplementary measures can take a myriad of forms, including technical (controls such as encryption), organisational (procedures) and contractual, and must be assessed on a case-by-case basis for the specific transfer mechanism.

N.B. A non-exhaustive list of supplementary measures can be found in Annex 2 of the Recommendations.

Step 5- Take any formal procedural measures

A recurring theme throughout the Recommendations is the need for a nuanced approach when assessing each specific transfer mechanism; as such, the procedural measures that will need to be taken depend on the specific transfer mechanism, with some mechanisms requiring notification of the supervisory authority.

Step 6- Re-evaluate at appropriate intervals

As with all aspects of compliance, monitoring and re-evaluation of supplementary measures should be done frequently. The Recommendations do not explicitly define a time period; however, factors which could impact the level of protection of the transferred data, such as developments in the third country’s legislation, will trigger re-evaluation.

One of the main aims of the GDPR (and also one of the key principles) is that of accountability and the EDPB’s Recommendations on supplementary measures bolsters this premise. There is emphasis placed on documentation which adequately considers and records the decision-making process at each of the six steps to ensure organisations have an accurate audit trail.

In addition to the EDPB’s Recommendations, it is important for organisations (especially global ones) to take heed of any local developments in this area. With the CNIL already publishing guidance, the ICO expected to issue guidance, and the Bavarian Data Protection Authority’s ruling against Mailchimp in this area, it can be said that supplementary measures will be the crux of many impending data protection developments.

Olivia Wint is a seasoned data protection professional, with over five years’ experience in this area. Olivia has worked in a range of sectors including local authority, third sector, start-ups and the Big 4, advising on all aspects of data protection compliance.