The Personal Data life cycle: Where to start the analysis? – Vladyslav Tamashev, Privacy lawyer at Legal IT Group

Have you ever thought about the data on your computer? It doesn’t matter whether you are a content creator, a programmer, or just a regular user: thousands of different files have been created, downloaded, and altered on your device. But what happens when some of that data becomes useless to you?

Usually, this data is manually deleted to free up space on your storage device, or it is wiped during an OS reinstallation. Everything that happens to that data, from its creation or collection until its destruction, is called the data life cycle.

The data life cycle is the sequence of stages that a particular unit of data goes through. The simplified life cycle model has five basic stages: Collection, Processing, Retention, Disclosure, and Destruction. In practice, when we talk about the personal data life cycle, this sequence can look dramatically different depending on the type of information, its usage and origin, company policies, and personal data protection regulations and legislation.

Nowadays, one of the most challenging tasks for an IT company is to build efficient and secure data processing that complies with local legislation and international regulations. Data life cycle optimization and adaptation is a complex task that starts with a personal data life cycle analysis.

The first step in a personal data life cycle analysis is to define the principles of data collection and processing in the company.

Here are some simple questions that will help you (a minimal data-inventory sketch in code follows the list):

  • What is the purpose of collecting users’ data in your company? (marketing, statistics, software optimization, etc.)
  • What information is collected? (name, payment information, location, music preferences, etc.)
  • How is it collected? (directly from the user, surveillance, third parties, etc.)
  • Which categories of data are necessary and which are not? (For example, email, name, and payment information are necessary for the company; a profile photo, favorite music band, or phone number is not.)
  • Who will have access to that data? (top management, outsourced teams, automated processes only, etc.)
  • Will that data be shared with third parties? (no, contractors, processors, etc.)
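One way to capture the answers to these questions is a simple data inventory. The sketch below is a minimal, hypothetical example in Python; the field names and sample categories are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class InventoryEntry:
    category: str                 # e.g. "email address", "payment information"
    purpose: str                  # why it is collected: "billing", "marketing", ...
    source: str                   # "registration form", "third party", "surveillance", ...
    necessary: bool               # is it required for the company to deliver its service?
    accessed_by: list[str] = field(default_factory=list)   # roles or teams with access
    shared_with: list[str] = field(default_factory=list)   # external recipients, if any

inventory = [
    InventoryEntry("email address", "account login, billing", "registration form",
                   True, ["support team"], ["payment processor"]),
    InventoryEntry("favourite music band", "personalisation", "profile page",
                   False, ["recommendation service"], ["advertising partner"]),
]

# A typical review target: optional data that is nevertheless shared externally.
for entry in inventory:
    if not entry.necessary and entry.shared_with:
        print(f"Review: {entry.category} is optional but shared with {entry.shared_with}")
```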

In the second step, the data should be divided into categories and analyzed for the risks associated with each of them. Risk analysis helps highlight the most critical and valuable data categories. There are many approaches to risk determination, but most of them combine the probability of a negative event and its possible negative consequences in different variations:

risk = probability of a negative event × negative consequences

For more precise risk determination, factors such as potential vulnerabilities, possible negative events, the intensity of negative effects, and the available response to those effects may also be taken into account. Personal data such as identification and payment information is at much higher risk of dedicated hacker attacks than, for example, less valuable website usage statistics or users’ interface colour-scheme preferences.
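To make the formula above concrete, here is a hedged sketch that scores each category on a simple 1–5 scale for probability and impact; the categories and the scores themselves are illustrative assumptions, not measured values.

```python
def risk_score(probability: int, impact: int) -> int:
    """risk = probability of a negative event x severity of its consequences (both on a 1-5 scale)."""
    return probability * impact

categories = {
    # category: (probability of a negative event, severity of consequences)
    "payment information":      (4, 5),  # attractive target, severe harm if leaked
    "identification data":      (4, 4),
    "website usage statistics": (2, 2),
    "colour-scheme preference": (1, 1),  # low value, negligible harm
}

# Rank categories from highest to lowest risk.
for name, (p, i) in sorted(categories.items(), key=lambda kv: -risk_score(*kv[1])):
    print(f"{name}: risk {risk_score(p, i)}")
```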

After the general analysis, each stage of the life cycle should be analyzed separately.

Collection – the first stage of the data life cycle. Users must be informed about the collection of their data in the form of consent or a notice. In terms of collection mechanics, data can be obtained directly (e.g. a registration form) or indirectly (surveillance, third parties).

Processing – this stage is unique to each company. It can be done manually by company employees, automatically, or with a mixed approach, depending on the data category and the purpose of collection. The main principles are to process only as much information as is needed to perform the company’s tasks and to restrict unauthorized access.

Retention – the storage of information. Data should be stored no longer than necessary, or no longer than the period defined by the data policy. For the data life cycle analysis, this stage is the key point: depending on the data type, the data can then be reused, destroyed, or disclosed.
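As an illustration of the retention stage, the sketch below checks whether a record has outlived its retention period. The categories and periods are hypothetical examples, not legal guidance.

```python
from datetime import date, timedelta

RETENTION = {
    # category: maximum retention period (illustrative values only)
    "payment information":   timedelta(days=365 * 6),
    "marketing preferences": timedelta(days=365 * 2),
}

def is_expired(category: str, collected_on: date, today: date) -> bool:
    """True if the record has outlived its retention period and should move on to destruction."""
    return today - collected_on > RETENTION[category]

# A record collected in March 2019 has exceeded a two-year retention period by September 2021.
print(is_expired("marketing preferences", date(2019, 3, 1), date(2021, 9, 30)))  # True
```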

Destruction – simple data deletion is enough for most scenarios, but full data destruction means that the data should be wiped from servers, backup files, internal documentation, employees’ PCs, and any other storage devices connected to the company. That’s why data tracking should be applied inside the company.
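Such data tracking can be as simple as a registry of every system that holds a copy of a subject’s data, so that a destruction request can be fanned out to all of them. The sketch below is an illustrative assumption; the system names are hypothetical.

```python
from collections import defaultdict

# subject id -> systems that hold a copy of that subject's data
copies: dict[str, set[str]] = defaultdict(set)

def record_copy(subject_id: str, system: str) -> None:
    copies[subject_id].add(system)

def destruction_plan(subject_id: str) -> set[str]:
    """Every location that must be wiped before destruction of this subject's data is complete."""
    return set(copies.get(subject_id, set()))

record_copy("user-42", "production database")
record_copy("user-42", "nightly backup")
record_copy("user-42", "CRM export on an analyst's laptop")
print(destruction_plan("user-42"))
```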

Reuse – the most common stage of the data life cycle. Each time you log into an account or receive a personalized email, your data is reused by the company and altered according to your actions.

Disclosure – data sharing is important for providing good services and promoting your business. Advertising, statistics, marketing, and other services are largely based on third-party data disclosure. During the analysis, you should ensure that each data transfer is compliant with legislation and the company’s privacy policy, and is allowed by the user.

A personal data life cycle analysis is a complex process that touches almost every aspect of the company: its data flows, business model, and internal and external structure. But it is the first step in developing a data processing system that is resistant to external and internal threats and puts users’ privacy and data security first.

Vladyslav Tamashev
Privacy lawyer at
Legal IT Group

Privacy Law Monthly Round Up – September 2021

Headlines

Ben and Deborah Stokes’ privacy claim against The Sun for the highly intrusive article detailing traumatic events in the Stokes family’s past was settled on 30 August 2021, with the newspaper agreeing to publish an apology and pay substantial damages. Paul Wragg wrote about The Sun’s “nonsensical” defence for the Inforrm Blog, concluding that the only party spared the anguish of trial was the newspaper’s defence team.

Government and General legislative developments

The controversial Police, Crime, Sentencing and Courts Bill had its second reading in the House of Lords this month. The Bill is notorious for its proposed restrictions on peaceful protest, which critics have predicted will have a discriminatory impact and breach the rights to freedom of expression and assembly. Broadened police powers would also enable the extraction of more information from mobile phones.

The Age Appropriate Design Code (aka the “Children’s Code”) entered into force on 2 September 2021 following a one-year transition period. The Children’s Code explains to businesses how the UK GDPR, the Data Protection Act and the Privacy and Electronic Communications Regulations apply to the design and delivery of Information Society Services (“ISS”) – i.e. social media, educational and gaming platforms – likely to be accessed by children. The Children’s Code is the first of its kind worldwide, and has been welcomed by many as a positive development for keeping children safe online. The 15 standards that the Code sets can be found here.

Sticking with child safety online, Home Secretary Priti Patel launched a Safety Tech Challenge fund at the G7 meeting at the start of this month. Five applicants will be awarded up to £85,000 each to develop new technologies that enable the detection of child sexual abuse material online without breaking end-to-end encryption.

The UK Government has launched a public consultation on data protection legislation reform following Brexit, entitled Data: A new direction. The consultation is open until 19 November. Following the end of the Brexit transition period, the UK’s data protection regime, which derived from the EU framework, has been transposed into domestic law as the UK GDPR. The Government is seeking to use this opportunity to make some changes to the current regime. The Hawtalk Blog discusses how some of these proposals are unethical and unsafe. Further discussion can be found on the Panopticon Blog and the Data Protection Report.

Data Privacy and Data Protection

Cressida Dick, the Metropolitan Police Commissioner, has accused tech giants of undermining terrorist prevention efforts by virtue of their focus on end-to-end encryption. Writing in The Telegraph on the twentieth anniversary of the 9/11 attacks, she said that it is “impossible in some cases” for the police to fulfil their role to protect the public. Given the pressure on tech giants to ensure users’ privacy, companies are unlikely to reshape their platforms to facilitate more extensive monitoring.

Apple has delayed its plan to scan its users’ iCloud images for child sexual abuse material. The proposed detection technology would compare images, before they are uploaded to iCloud, against unique “digital fingerprints” of known child pornographic material maintained by the National Center for Missing and Exploited Children. The plan was criticised by privacy groups because it involved using an individual’s own device to check whether they were potentially engaged in criminal activity.

Surveillance

The Metropolitan Police have invested £3 million into new facial recognition technologies (FRT) that will greatly increase surveillance capabilities in the capital. The expansion of the Met’s technology will enable it to process historic images from CCTV feeds, social media and other sources in order to track down suspects. Critics argue that such FRT encroaches on privacy by “turning back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years.” There is also concern that FRT can exacerbate existing racial discrimination in the criminal justice system. The UK’s Surveillance Camera Commissioner (SCC), Professor Fraser Sampson, has acknowledged that some FRT “are so ethically fraught” that it may only be appropriate to carry them out under license in the future.

NGOs

Big Brother Watch published an opinion piece warning that the imposition of vaccine passports could reorganise Britain into a two-tier, checkpoint society. The article responds to the Scottish Parliament’s vote in favour of vaccine passports earlier this month. Wales has since followed Scotland and announced mandatory vaccination and COVID status check schemes. The English government has not yet committed to such a regime. The ICO has emphasised that data protection laws will not stand in the way of mandatory vaccination and COVID status checks, but rather facilitate responsible sharing of personal data where it is necessary to protect public health. 

Privacy International has considered how the data-intensive systems and surveillance infrastructure developed in Afghanistan by national and foreign actors, as part of development and counter-terrorism measures, will fare under the Taliban regime.

From the regulator

ICO

The ICO has announced two fines this month:

  • A total of £495,000 in fines was imposed on We Buy Any Car, Saga, and Sports Direct for sending more than 354 million “frustrating and intrusive” nuisance messages between them. None of the companies had permission to send recipients marketing emails or texts, making their behaviour illegal;
  • The Glasgow-based company DialADeal Scotland Ltd was fined £150,000 for making more than 500,000 nuisance marketing calls to recipients who had not given their permission to receive them.

The ICO has also released a communiqué from a meeting on data protection and privacy held by the G7 authorities at the start of the month. The meeting is closely aligned with the Roadmap for Cooperation on Data Free Flow with Trust announced by G7 Digital and Technology Ministers on 28 April 2021.

IPSO

IPSO has published a number of privacy rulings and resolutions.

IMPRESS

There were no IMPRESS rulings relating to privacy this month.

Cases

The Inforrm Blog has published an article detailing the continued decline in privacy injunction applications in England and Wales for 2021. There were only three applications in the first six months of the year, down from ten in 2020. All three applications were successful. Only 4% of the newly issued cases on the Media and Communications List related to misuse of private information or breach of privacy.

No judgments relating to privacy have been handed down this month.


Written by Colette Allen

Colette Allen has hosted “Newscast” on The Media Law Podcast with Dr Thomas Bennett and Professor Paul Wragg since 2018. She has recently finished the BTC at The Inns of Court College of Advocacy and will be starting an MSc in the Social Sciences of the Internet at the University of Oxford in October 2021.

Citation: Wired: The privacy of crypto-currency

Wired has published an insightful article on virtual currencies.

The article considers the privacy implications of crypto-currency transactions. It highlights the issues surrounding logging each transaction in a publicly available manner and concerns around behavioural modelling.

The article considers the providers Monero and Zcash in particular.

Google and healthcare provider Ascension collaboration raises privacy concerns

Google Cloud has been providing Ascension, the second-biggest healthcare provider in the US, with cloud infrastructure services since July 2019. Providing software services to healthcare providers to facilitate the secure management of patient data is not uncommon for Google. The services Ascension is taking are similarly commonplace: the migration of data to Google Cloud, the use of productivity suite tools, and the provision of technological tools for Ascension’s doctors. The defining factor is perhaps the scale: this is the largest project of its kind to date, managing the data of over 50 million Americans. The project was dubbed “Project Nightingale”.


Data protection rights

Personal data, such as your name, likeness, birthday, or any other information which can be used to identify you, is highly sensitive.

Protecting your personal data, and bringing actions when it is harvested, used or misused, is a key foundation of the right to privacy.

£3 billion class action against Google given the go-ahead – Lloyd v Google LLC [2019] EWCA Civ 1599

Mr Lloyd, a consumer protection advocate, brought a claim against Google for damages on behalf of 4m Apple iPhone users. It was alleged that Google secretly tracked some of their internet activity for commercial purposes between 9 August 2011 and 15 February 2012.

Copyright

Copyright under English law is primarily established under the Copyright, Designs and Patents Act 1988. Copyright can extend to protect videos and images taken by you on your devices.

In such circumstances, these videos and images are protected for 70 years from the end of the life of the taker. This can function to protect photographs and videos that you have taken from use by third parties. By enforcing your copyright ownership you can control who has the right to use and edit the images and/or footage in question. This is usually done in the form of a cease and desist letter notifying the third party of your ownership of the material whilst asking that they stop usage as soon as possible.


The right to be forgotten does not apply to search engine results globally

On 24 September 2019 the European Court of Justice (“ECJ”) handed down judgment in Google v CNIL C-507/17. The effect of the case is that right to be forgotten requests need only be applied to the search engine domains of Member States, and not extra-territorially on a global basis. The case therefore has implications for the processing and effectiveness of right to be forgotten requests, particularly for requestors who seek de-listing of search results from multiple non-EU jurisdictions. Notably, the administrative burden upon search engine operators has been limited by the ruling.



Privacy concerns around Amazon’s Ring

“A home security product upscaled and diversified into law enforcement and integrated with video software brings with it some serious privacy concerns.”

What is the Ring?


The Ring is Amazon’s bestselling smart security product line. The most notable device is the Ring doorbell, which allows users to monitor movement by their front doors, capture video, and receive mobile notifications whenever someone presses the doorbell. Users can also benefit from an app, installed on their mobile, which monitors local news and allows social-media-style sharing with other Ring users.

Ring additionally offers security services, cross-selling into the wider security service market.

Ring and law enforcement

Recent controversy was sparked when it was found that Ring is partnering with over 400 police departments in the United States. Ring’s collaborative efforts extend to targeting ad words at users, encouraging them to share live video feed footage with law enforcement. This in and of itself is a significant extension of police surveillance, meriting further legislative scrutiny.

However, pair this with the fact that Ring is being dubbed “the new neighborhood watch”, and it becomes a little disconcerting.

It is well established that a person’s likeness is considered personal data and that recording individuals without their consent is potentially invasive. There are also civil liberties concerns regarding the police acquiring these live video feeds for their own use.

This has drawn the attention of the Senator for Massachusetts, Edward Markey, who recently published a letter sent to Amazon’s CEO Jeff Bezos highlighting civil liberties concerns with the Ring. The letter highlights an issue previously raised in the United Kingdom in relation to the use of facial recognition software: its potential to racially profile individuals. Whilst this was considered by the Administrative Court to be too intangible an argument, lacking sufficient supporting data, further scrutiny would be most welcome.

And further scrutiny does seem forthcoming. In his letter, Senator Markey highlights 10 key concerns around the Ring system, demanding a response from the Amazon CEO by 26 September 2019. We highly recommend readers consider the letter in its entirety here.

Police forces use of facial recognition software determined lawful – R (Bridges) v Chief Constable of South Wales Police [2019] EWHC 2341 (Admin)

“The algorithms of the law must keep pace with new and emerging technologies.” – at [1]

The Facts

The Administrative Court, with Haddon-Cave LJ and Swift J sitting, has heard the first case on the lawfulness of the police using automated facial recognition software (“AFR”). The case concerned South Wales Police’s (“SWP”) use of AFR on two occasions on which it allegedly recorded the Claimant’s image: once on 21 December 2017 at Queen Street, Cardiff, and again at the Defence Procurement, Research, Technology and Exportability Exhibition (“the Defence Exhibition”) on 27 March 2018.
