LAWS0338 Privacy, Data and Surveillance Law

Dr Michael Veale, UCL Faculty of Laws, 2021-22 Syllabus

Updated January 2023

      ,-.
     / \  `.  __..-,O
    :   \ --''_..-'.'
    |    . .-' `. '.
    :     .     .`.'
     \     `.  /  ..
      \      `.   ' .
       `,       `.   \
      ,|,`.        `-.\
     '.||  ``-...__..-`
      |  |
      |__|
      /||\
     //||\\
    // || \\
 __//__||__\\__
'--------------'

Reading importance labels — Check the labels beside each reading! Compulsory means “please do this before class”. Recommended means “if you’re interested, or if you’re writing an essay or revising this topic, have a look at this”. It does not mean “if you’re a really good student you’ll have done all the recommended reading as well as compulsory reading before class”. Optional means “if you’re writing an essay, or it interests you, this might be something you want to look at, but you could also do your own research and find other sources too.”

Open access — Wherever possible, resources are accompanied by an open access link (‘OA link’). Some resources are available freely but only behind for-profit repositories such as SSRN, which heavily push users to register and log in, and hide the option to download without an account at the bottom of the page. These are ‘OA-ish links’. Occasionally, a paper or book is too important not to recommend even though an OA version is unavailable. I have tried to minimise such resources throughout the reading list.

S1: What is Privacy, Data and Surveillance Law, Anyway?

What is privacy all about, and why might we seek it? In this session, we will take a look at some of the different issues privacy might, or has been thought to, protect. The three compulsory readings approach this from different angles: Solove offers a taxonomy to cover a breadth of the issues; Viljoen looks at what ‘data law’ might do, focussing on the way that technologies construct and mediate human relations; while Lynskey considers the developing approach(es) that courts in the UK have taken to a set of rights that have only relatively recently made their way into law in this jurisdiction.

The recommended and optional readings present further perspectives. Gürses unpacks how computer scientists often think about privacy; O’Hara tries to understand why people talk past each other when discussing what privacy is or should do; Hildebrandt thinks about how privacy might be theorised in relation to a world of profiling machines; while Warren and Brandeis, in a seminal article, try to locate a right to privacy within the common law — influential on later thought, not least as Brandeis became a US Supreme Court justice.

The core questions for this first session are broad, but set the scene for this module. What do you think privacy is or should protect? What should the priorities for law be in an informationalised world, and have courts so far appeared to be up to these challenges?

Articles

  • Compulsory Daniel J Solove, ‘A Taxonomy of Privacy’ (2005–06) 154 U Pa L Rev 477 OA link
  • Compulsory Salomé Viljoen, ‘A Relational Theory of Data Governance’ (2021) 131 Yale Law Journal 573 OA link
  • Compulsory Orla Lynskey, ‘Courts, Privacy and Data Protection in the UK: Why Two Wrongs Don’t Make a Right’ in Courts, Privacy and Data Protection in the Digital Environment (Edward Elgar Publishing 2017) UCL link
  • Recommended Julie E Cohen, ‘What Privacy is For’ (2012–13) 126 Harv L Rev 1904 OA link
  • Recommended Kieron O’Hara, ‘The Seven Veils of Privacy’ (2016) 20 IEEE Internet Computing 86 UCL link
  • Recommended Seda Gürses, ‘Can You Engineer Privacy?’ (2014) 57 Communications of the ACM 20. UCL typeset link / OA preprint link
  • Optional Mireille Hildebrandt, ‘Privacy as Protection of the Incomputable Self: From Agnostic to Agonistic Machine Learning’ (2019) 20 Theoretical Inquiries in Law 83 OA link
  • Optional Samuel D Warren and Louis D Brandeis, ‘The Right to Privacy’ (1890) 4 Harvard Law Review 193 OA link
  • Optional Julie E Cohen, ‘Turning Privacy Inside Out’ (2019) 20 Theoretical Inquiries in Law OA link

S2: Data Protection: Form and Function

In this session, we introduce a parallel but interwoven regime to privacy and private life: data protection. Lynskey introduces where data protection came from, what it does, and how it relates to privacy. Her book was written before the passing of the General Data Protection Regulation, which builds on earlier data protection instruments; the Regulation is the topic of Hoofnagle and others, who seek to summarise it. You should also read the GDPR alongside this article as appropriate.

Further readings look at the history of data protection law (González Fuster), and elaborate theoretically on the functioning of data protection and its place within the EU legal order (Lynskey, Ausloos).

Think while reading:

  1. What does data protection seek to do and protect?
  2. How does it secure the aims we discussed when considering privacy? What other ends does it pursue, and in what other ways might it protect or empower people?
  3. Is it a subset of privacy, or a separate, complementary regime?
  4. How many of the rights and obligations were you aware of, and how does the text of data protection law relate to the practices of firms and governments that you are aware of?
  5. Which areas do you think might hold the most transformative promise, and which the least?

Articles

  • Compulsory Orla Lynskey, ‘The Key Characteristics of the EU Data Protection Regime’ and ‘The Link between Data Protection and Privacy in the EU Legal Order’ in The Foundations of EU Data Protection Law (Oxford University Press 2015) UCL link
  • Compulsory Chris Jay Hoofnagle and others, ‘The European Union General Data Protection Regulation: What It Is and What It Means’ (2019) 28 Information & Communications Technology Law 65 OA link
  • Recommended Gloria González Fuster, ‘The Materialisation of Data Protection in International Instruments’ in The Emergence of Personal Data Protection as a Fundamental Right of the EU (Springer 2014) UCL link
  • Recommended Orla Lynskey, The Foundations of EU Data Protection Law (Oxford University Press 2015) UCL link
  • Optional Jef Ausloos, ‘Foundations of Data Protection Law’ in The Right to Erasure in EU Data Protection Law: From Individual Rights to Effective Protection (Oxford University Press 2020) UCL link

Policy Documents

  • Optional European Court of Human Rights, Guide to the Case Law of the European Court of Human Rights: Data Protection (Council of Europe, updated regularly) (look at relevant parts to bolster your understanding) OA link

Statute

  • Compulsory Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L 119/1. This is the text of the GDPR. Read this alongside the Hoofnagle and others article. You do not need to read all of the recitals yet (skipping them removes most of the length), but they do help shine light on the main articles and are important interpretative tools, so refer back to them as appropriate.
  • Compulsory Charter of Fundamental Rights of the European Union, articles 7–8.

T1: Privacy Enhancing Technologies and Contact Tracing Apps

In this session we’re going to dive into a topical example of a privacy-enhancing technology: the DP-3T system for mobile contact/proximity tracing using Bluetooth. This system was co-developed with UCL Laws, and illustrates a range of tensions between privacy, platforms and power.
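To make the decentralised design concrete before the readings, here is a deliberately simplified Python sketch of the pattern DP-3T popularised. It is not the actual DP-3T specification (which uses proper cryptographic key schedules and Bluetooth-level detail); it is only meant to show why, in this design, the central server never learns who met whom.

```python
# Simplified illustration of decentralised proximity tracing (not the DP-3T spec).
import hashlib
import secrets

def ephemeral_ids(day_key, n=4):
    """Derive n short-lived, rotating broadcast identifiers from a secret day key."""
    return [hashlib.sha256(day_key + i.to_bytes(2, "big")).digest()[:16] for i in range(n)]

# Alice's phone generates a secret day key and broadcasts the derived identifiers.
alice_day_key = secrets.token_bytes(32)
alice_broadcasts = ephemeral_ids(alice_day_key)

# Bob's phone passively records identifiers it observes nearby over Bluetooth.
bob_observed = {alice_broadcasts[2], secrets.token_bytes(16)}  # Alice, plus a stranger

# Alice tests positive and uploads only her day key; the server never sees who
# she met. Bob's phone downloads published keys and re-derives the identifiers
# locally, so the exposure match happens on the device itself.
published_keys = [alice_day_key]
exposed = any(eph in bob_observed
              for key in published_keys
              for eph in ephemeral_ids(key))
print("Exposure notification:", exposed)  # True
```

Notice that even here the operating system and Bluetooth stack sit between the law and the data: a useful thing to hold in mind for the platform questions below.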

Throughout all of this reading, think about the following:

  • What are the concerns about different types of contact tracing? Which concerns are specific to ‘centralised’ systems, and which concerns remain even in a decentralised version?
  • How would data protection apply to centralised or decentralised contact tracing systems? What would be the main differences? (think about the material scope)
  • What are the roles of operating systems and platforms in the governance of privacy?
  • Who might the data controllers of systems such as these contact tracing tools be? Who should they be? Is there more than one?
  • Do data protection and/or privacy law deal with the concerns that remain even in systems powered by privacy-enhancing technologies? If not, what legal interventions might be needed?

Essential reading

Now it’s time to get familiar with privacy-preserving digital contact tracing. I’m setting a range of reading — each piece is quite short (sometimes just a couple of pages) so don’t be overwhelmed by the number of them! One is a cartoon!

Further reading

S3: The Law of Everything? Anonymisation and the Scope of Personal Data

One of the main concepts in data protection is that of personal data. The boundaries of this concept have been hotly contested, and are an important point of interaction for law, policy, computer and data science alike; all of these disciplines work on this issue intensely. McAuley describes why computer scientists find anonymisation difficult to achieve. Purtova argues that the CJEU has interpreted the GDPR in an expansive manner which allows an unmanageable array of information types to be classified as personal data (but compare the optional reading by Dalla Corte, who critiques this interpretation). Elliot and others propose a different approach, which looks at the risk that data will be re-identified in its environment. What would be the benefits or risks of adopting this approach? You should also read the Breyer case, which is discussed at length in the Purtova article.

You may also choose to explore the further readings, such as de Montjoye and others’ technical analysis, from a computer science perspective, of why it is difficult or impossible to anonymise some types of location data, or Gellert’s application of these questions to smart environments and technologies. You should also consider the household exemption, part of the scope of data protection law, which is importantly limited by the cases Lindqvist and Ryneš.

How should a controller in practice go about deciding what is personal data and what is not? How might this differ for different types of data — text; location; tabular data; video data or photographs? Is a workable dividing line between personal and non-personal data possible, or will any approach inevitably be gamed and abused? If so, is there a way out of this quandary?
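To see why removing names is not the same as anonymising, here is a toy Python sketch of a linkage attack, using entirely invented records: the “anonymised” release is re-identified simply by joining it with a public register on quasi-identifiers. This is the kind of environment-dependent re-identification risk the technical readings are concerned with, and it feeds directly into the Breyer question of what means of identification are reasonably likely to be used.

```python
# Toy linkage attack on an "anonymised" dataset (all records invented).
deidentified_release = [
    {"postcode": "WC1H", "birth_year": 1990, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "N7",   "birth_year": 1985, "sex": "M", "diagnosis": "diabetes"},
]
public_register = [  # e.g. an electoral roll or scraped social media profiles
    {"name": "A. Example", "postcode": "WC1H", "birth_year": 1990, "sex": "F"},
    {"name": "B. Example", "postcode": "E8",   "birth_year": 1972, "sex": "M"},
]
QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def link(release, register):
    """Yield (name, attribute) pairs where the quasi-identifiers single out one person."""
    for row in release:
        matches = [p for p in register
                   if all(p[q] == row[q] for q in QUASI_IDENTIFIERS)]
        if len(matches) == 1:
            yield matches[0]["name"], row["diagnosis"]

print(list(link(deidentified_release, public_register)))
# [('A. Example', 'asthma')]: no names were in the release, yet someone is identified.
```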

While reading, as well as when you have read both the papers and the cases, think through the following questions:

  1. What is it about data that makes anonymisation particularly hard?
  2. Can you think of a good example of data which is reliably non-personal under data protection law?
  3. What sort of test does Breyer imply? What kind of capacities might data controllers need to carry out this test?
  4. Is this approach to personal data a sensible one for data protection law? What might be the challenges if a narrower approach was taken? What about a broader one?
  5. Is the household exemption too narrow in scope?

Statute

European Union

  • Compulsory GDPR, recitals 26-30, arts 2, 4(1).

Videos

Articles

Cases

European Union

  • Compulsory Case C-582/14 Patrick Breyer v Bundesrepublik Deutschland ECLI:EU:C:2016:779 (on the identifiability of personal data)
  • Recommended Case C‑434/16 Nowak ECLI:EU:C:2017:994 (on exam scripts and comments)
  • Recommended Case C-101/01 Lindqvist ECLI:EU:C:2003:596 (on the scope of the household exemption)
  • Recommended Case C‑212/13 Ryneš ECLI:EU:C:2014:2428 (on a CCTV camera on a house and the household exemption)

Pending Cases

  • Case C-604/22 IAB Europe (on whether a compliance mechanism for online tracking comprises personal data)
  • Case C-659/22 Ministerstvo zdravotnictví (Czech Ministry of Health) (on whether scanning vaccination certs amounts to processing)
  • Case C-115/22 NADA and Others (on health data and doping)
  • Case C-446/21 Schrems (on sexuality and special category data in advertising)

United Kingdom

  • Optional Durant v Financial Services Authority [2003] EWCA Civ 1746.
  • Optional Edem v The Information Commissioner & Anor [2014] EWCA Civ 92
  • Optional Secretary of State for the Home Department & Anor v TLU & Anor [2018] EWHC 2217 (QB).

S4: Revenge of the Cookie Monster

Cookie pop-ups when you browse the web are hardly anything new to most Internet users, particularly in Europe. But what exactly are cookies? How do they do something that we should be concerned about? How does the law understand cookies, and how has that developed over time? In this session, we’re going to try and get some answers to some of these questions. While reading and examining the below, it will be good to think about the following questions (a short illustrative sketch of the cookie mechanism follows them):

  1. What conceptual role do cookies play in online tracking and profiling?
  2. Consent is a large part of the way that we govern cookies in Europe. But is it a good way to do this? What would the alternatives look like, and in what ways would they be better or worse? 
  3. If many of the ways cookies are used on the Internet are currently illegal, why hasn’t anything been done about it? What are the main challenges and regulatory hurdles that prevent effective enforcement?
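As background for these questions, here is a minimal sketch using Python’s standard http.cookies module and invented domain names. Mechanically, a cookie is just a stored name=value pair that the browser automatically returns to the domain that set it; that is what lets a third party embedded on many sites recognise the same browser across all of them.

```python
# Bare mechanics of a third-party tracking cookie (domains and IDs invented).
from http import cookies

# 1. The browser visits news-site.example, which embeds content from ads.example.
#    ads.example answers that embedded request with a header setting a unique ID:
set_cookie_header = 'uid=abc123; Max-Age=31536000; Domain=ads.example; Secure'
jar = cookies.SimpleCookie()
jar.load(set_cookie_header)   # the browser stores this cookie for ads.example

# 2. Later, the browser visits shop.example, which also embeds ads.example.
#    The stored cookie is sent back automatically with that third-party request:
print(jar.output(attrs=[], header="Cookie:"))   # Cookie: uid=abc123

# ads.example now knows the same browser visited both news-site.example and
# shop.example, and can add both visits to a profile keyed on uid=abc123.
```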

We’ll then look at controllership, which matters in online tracking: who is responsible? We’ve seen that online tracking involves a huge number of players, and in many ways resembles a complex ecosystem of unclear legality. But who should be responsible for it, or for similar online issues with so many players and layers of responsibility? Courts are increasingly facing these questions — and their answers are subject to criticism. The topic of data controllership has emerged as a difficult sticking point here, and one that we will look at to better understand data protection law. What happens when it seems that those running websites or apps have, in effect, lost (or never had) practical control? Are they expected to be liable for data processing they enable through partnerships with other Internet services or plugins? Or should they have no responsibility, creating a moral hazard where they can install anything they like without fear of repercussion?

When reading about controllership, think of the following questions:

  1. Is the idea of effective and complete protection built up by the CJEU leading to a world where everyone is a controller, but no-one is responsible?
  2. What would the adequate level of responsibility be for, say, a news website using a Facebook plugin that enables Facebook to track its users? What should happen if Facebook misuse that data?
  3. Should standard setters (arguably like the Jehovah’s Witnesses) be considered controllers? Is this a useful way to govern complex systems (by finding their coordinating actors and holding them to account), or is it unfair to go after rule-makers that may not themselves have directly misused data? What kind of regime might strike an appropriate balance?

Articles

Policy Documents

  • Optional Information Commissioner’s Office, ‘Update Report into Adtech and Real Time Bidding’ (Information Commissioner’s Office, 20 June 2019) link

Cases

European Union

  • Compulsory Case C-210/16 Wirtschaftsakademie Schleswig-Holstein ECLI:EU:C:2018:388.
  • Compulsory Case C-49/17 Fashion ID GmbH & CoKG v Verbraucherzentrale NRW eV ECLI:EU:C:2019:629.
  • Optional Case C‑25/17 Jehovan todistajat ECLI:EU:C:2018:551.
  • Optional Case C-673/17 Planet49 GmbH ECLI:EU:C:2019:801.

United Kingdom

  • Optional Vidal-Hall v Google Inc [2015] EWCA Civ 311.
  • Optional Lloyd v Google LLC [2021] UKSC 50.

S5: Forget Me, or Forget Me Not? The Right to Erasure

The “right to be forgotten” is famous, and contentious. In this session, we’ll separate fact from fiction and try to dig further into questions of who can be forgotten online, how, and why.

Think about the following while reading:

  • Is the right to erasure the “ultimate power tool” for data subjects? Or is it not quite as empowering as it sounds?
  • How does data protection law balance erasure with reasons for non-erasure? Who determines this balancing in practice — and what would be needed to ensure that it is fair?
  • How does the CJEU build on the issues of responsibilities of search engines in its case law subsequent to Google Spain?
  • What are the differences between the right to object, and the right to erase? Which is more powerful?
  • Was Google Spain a good judgment? How does Google’s role as data controller square with their role as an intermediary benefitting from liability shields?

Have a look at the following tools from Google:

Articles

  • Compulsory Andrés Guadamuz, ‘Developing a Right to Be Forgotten’ in Tatiana-Eleni Synodinou and others (eds), EU Internet Law: Regulation and Enforcement (Springer 2017). OA-ish link UCL paywall link
    • An overview of some of the debates and sides people take concerning the Right to Be Forgotten.
  • Recommended Aleksandra Kuczerawy and Jef Ausloos, ‘From Notice-and-Takedown to Notice-and-Delist: Implementing Google Spain’ (2015–16) 14 Colo Tech LJ 219. OA link
    • A discussion of some of the roles (as they were and are emerging) of different governance actors in the run up to, and in the wake of, the judgment in Google Spain.
  • Recommended Jef Ausloos, The Right to Erasure in EU Data Protection Law: From Individual Rights to Effective Protection (Oxford University Press 2020). UCL link
  • Recommended Theo Bertram and others, ‘Five Years of the Right to Be Forgotten’ in Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security (ACM 2019) 959. OA link
    • A scholarly article on how the RTBF has panned out from the point of view of Google employees, who authored it. Read alongside the firm’s regularly updated, quantitative Transparency Report into EU delisting on the basis of the right.
  • Recommended Joris Van Hoboken, ‘Search Engine Freedom’ in Search Engine Freedom: On the Implications of the Right to Freedom of Expression for the Legal Governance of Web Search Engines (University of Amsterdam 2012) pp 168-213. OA link
    • A discussion of the theoretical and legal manners in which freedom of expression applies to search engines, given their crucial role in enabling access to information. Pre-dates Google Spain, although not the debates about the RTBF.
  • Optional Jean-François Blanchette and Deborah G Johnson, ‘Data Retention and the Panoptic Society: The Social Benefits of Forgetfulness’ (2002) 18 The Information Society 33. OA link paywalled, typeset link
    • A less legal view of why we might want a right to be forgotten from the standpoint of privacy.
  • Optional Tarleton Gillespie, ‘To Remove or to Filter?’ in Custodians of the Internet (Yale University Press 2018). UCL link (paywalled)
    • A broader discussion of the different tactics that internet intermediaries, and particularly platforms, use to limit the distribution of content online.
  • Optional David Erdos, ‘The “Right to Be Forgotten” beyond the EU: An Analysis of Wider G20 Regulatory Action and Potential Next Steps’ (2021) 13 Journal of Media Law 1. OA-ish link UCL link
    • Examines how similar rights to be forgotten work in jurisdictions including Canada, Turkey and Australia.
  • Optional Stefan Kulk and Frederik Zuiderveen Borgesius, ‘Privacy, Freedom of Expression, and the Right to Be Forgotten in Europe’ in Evan Selinger and others (eds), The Cambridge Handbook of Consumer Privacy (Cambridge University Press 2018). OA-ish link UCL paywall link
    • A useful introduction to how freedom of expression is balanced by the CJEU and ECtHR, focussing on the right to be forgotten. Overlaps otherwise in terms of content with Guadamuz 2017.

Policy Documents

  • Optional European Data Protection Board, Guidelines 5/2019 on the criteria of the Right to be Forgotten in the search engines cases under the GDPR (EDPB 2020) OA link
    • Guidance from the European regulators in charge of enforcing Google Spain and the Right to Erasure. See also the commentary on these guidelines by David Erdos, University of Cambridge.
  • Optional Access Now, Understanding the Right to Be Forgotten Globally (Access Now 2017) OA link
    • A short policy paper from an NGO indicating the surrounding conditions and a safeguard wishlist for global implementation of a right to be delisted on privacy grounds.

Statute

European Union

  • Optional Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L 119/1, art 17.

Case Law

European Union

  • Compulsory Case C-131/12 Google Spain SL and Google Inc v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González ECLI:EU:C:2014:317.
  • Compulsory Case C‑136/17 GC and Others v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:773.
  • Compulsory Case C‑507/17 Google LLC v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:772.
  • Compulsory Case C-460/20 Google (Déréférencement d’un contenu prétendument inexact) ECLI:EU:C:2022:962.

United Kingdom

  • Recommended NT1 & NT2 v Google LLC [2018] EWHC 799 (QB). BAILII

T2: Data Rights

For this tutorial, there is a short preparatory activity we will use as a basis for discussion about the right of access. Carrying out steps 1–2 is essential, but step 3 (writing further to a company to request your data) is optional. I advise doing it, however, as it is free, and it is interesting to see how they interact with you and what you get back.

  1. Try to use a “download my data” tool on one or more services that you use. Some examples of those from the largest services are below:

  2. If you get a copy of the data, try to locate the privacy policy for that company. Privacy policies are usually implemented so as to provide some of the information required in Article 13 of the GDPR (have a look). Re-read articles 13–15, and article 20, of the GDPR. Is this all the data you requested? What personal data might be missing? Do you also think (e.g. from your usage of a service) that the controller might have additional data about you that they are not telling you about?

  3. Write an email or other message to the firm (how to do so should be detailed in the privacy policy) asking for a full copy of your data, highlighting any omissions you discovered or suspected based on point 2. I have made a template for a very full version of how to do this here, but you are welcome to just type something shorter or more focussed.

Save all the files you get from this process in one place (e.g. a copy of relevant parts of the privacy policy, and any data), so we can talk about the process during the tutorial. You won’t hear back from the companies yet as part of step 3, but we can revisit this.

In the tutorial, we’ll be talking about the following:

  • What is the right of access for? What about the right of portability? What are the differences?
  • Does it work? What characteristics would you need a right of access to have to meet its purposes?
  • Is there any data that would be difficult for a company to provide access to?

Statute

  • Compulsory GDPR, arts 12, 15, 20.

Policy Documents

  • Compulsory European Data Protection Board, ‘Guidelines 01/2022 on data subject rights - Right of access’ (Version for Public Consultation, 2022) OA link
  • Optional Information Commissioner’s Office, Guidance on the Right of Access (2021)

Articles

Case Law

European Union

  • Optional Case C‑434/16 Nowak ECLI:EU:C:2017:994
  • Optional Joined Cases C‑141/12 and C‑372/12 YS and Others ECLI:EU:C:2014:2081.

Advocate General Opinions

  • Optional Case C‑487/21 Österreichische Datenschutzbehörde and CRIF ECLI:EU:C:2022:1000 (Opinion of Advocate General Pitruzzella)
  • Optional Case C‑154/21 Österreichische Post (Informations relatives aux destinataires de données personnelles) ECLI:EU:C:2022:452 (Opinion of Advocate General Pitruzzella)

England and Wales

  • Optional Dawson-Damer & Ors v Taylor Wessing LLP [2017] EWCA Civ 74.
  • Optional Ittihadieh v 5-11 Cheyne Gardens RTM Company Ltd & Ors [2017] EWCA Civ 121.
  • Optional DB v General Medical Council [2018] EWCA Civ 1497.
  • Optional Durant v Financial Services Authority [2003] EWCA Civ 1746. (note that this case is important to read in the context of subsequent European cases, including Nowak)
  • Optional Rudd v Bridle & Anor [2019] EWHC 893 (QB).
  • Optional R (Open Rights Group & Anor) v SSHD [2021] EWCA Civ 800

S6: Computer Says No? Algorithmic Decisions

Computers are intermediating the content of decisions now, rather than just transmitting them. The governing of algorithms and automated systems is big news, and big business for some. But what role for data protection law? In this session, we’ll dig into the details.
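To keep the distinction between a model’s output and a “decision” in view while reading, here is a hypothetical toy pipeline in Python; the features, weights and threshold are all invented. Roughly, the scoring step corresponds to profiling, and the thresholding step to the kind of decision based solely on automated processing that Article 22 targets.

```python
# Hypothetical toy pipeline (invented features, weights and threshold): a score
# is produced about a person (profiling), then a threshold converts it into an
# outcome with no human involved.
def credit_score(applicant):
    """Toy scoring model: a weighted sum of applicant features."""
    return (0.4 * applicant["income_thousands"]
            - 0.6 * applicant["missed_payments"]
            + 0.2 * applicant["years_at_address"])

def automated_decision(applicant, threshold=10.0):
    """The decision step: the outcome takes effect without human review."""
    return "approve" if credit_score(applicant) >= threshold else "refuse"

applicant = {"income_thousands": 30, "missed_payments": 2, "years_at_address": 1}
print(credit_score(applicant), automated_decision(applicant))  # 11.0 approve
# A question for the readings: is only the final approve/refuse step caught by
# Article 22, or must meaningful protection also reach the scoring stage?
```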

When doing the reading, think about the following:

  • What are the main issues concerning algorithms that require governance? What have the last few years indicated are the most pressing of these, and which might be the most pressing in the future?
  • Should we be worried about algorithms, or “decisions”, both or neither?
  • How can we reconcile the purpose of Article 22 with the rest of the GDPR? Does it connect to form a coherent whole, or does it not make sense? Is it best conceived as part of data protection, privacy, both or neither?
  • Is Article 22 a relic that will fail to govern algorithms going forward, or is there hope that it can be usefully repurposed? If so, as a simple safety net, or an active tool of governance?
  • Are there simple changes, or court interpretations, that might make the governance of algorithms through the GDPR function more effectively? Which? Are they likely without new legislation?

Articles

  • Compulsory Lilian Edwards and Michael Veale, ‘Slave to the Algorithm? Why a “Right to an Explanation” Is Probably Not the Remedy You Are Looking For’ (2017) 16 Duke Law & Technology Review 18 OA link
  • Compulsory Mireille Hildebrandt, ‘Privacy as Protection of the Incomputable Self: From Agnostic to Agonistic Machine Learning’ (2019) 20 Theoretical Inquiries in Law 83. OA link
  • Compulsory Andrew D Selbst and others, ‘Fairness and Abstraction in Sociotechnical Systems’ in Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT* ’19, New York, NY, USA, ACM 2019). OA link
  • Recommended Reuben Binns and Michael Veale, ‘Is that Your Final Decision? Multi-Stage Profiling, Selective Effects, and Article 22 of the GDPR’ [2021] International Data Privacy Law OA link
  • Compulsory Margot E Kaminski, ‘Binary Governance: Lessons from the GDPR’s Approach to Algorithmic Accountability’ (2019) 92 Southern California Law Review OA link
  • Recommended Andrew Selbst and Solon Barocas, ‘The Intuitive Appeal of Explainable Machines’ (2018) 87 Fordham Law Review 1085 OA link.
  • Optional Margot E Kaminski, ‘The Right to Explanation, Explained’ (2019) 34 Berkeley Technology Law Journal OA link
  • Optional Luca Tosoni, ‘The Right to Object to Automated Individual Decisions: Resolving the Ambiguity of Article 22(1) of the General Data Protection Regulation’ (2021) 11 International Data Privacy Law 145. OA-ish link UCL link
  • Optional Andrew D Selbst and Julia Powles, ‘Meaningful Information and the Right to Explanation’ (2017) 7 International Data Privacy Law 233. OA link
  • Optional Sandra Wachter and others, ‘Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation’ (2017) 7 International Data Privacy Law 76. OA link

Policy Documents

  • Compulsory Article 29 Working Party, ‘Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679 (WP251rev.01)’ (6 February 2018), read alongside articles 15 and 22 and recital 71 GDPR.
  • Optional Information Commissioner’s Office, ‘Guidance on AI and data protection’ (2021) link

Statute

European Union

  • Compulsory General Data Protection Regulation (GDPR), articles 15, 21, 22, recital 71.

Case-Law

European Union

Pending

  • Case C-634/21 SCHUFA Holding and Others (scoring) (on downstream algorithmic decisions)
  • Case C-203/22 Dun & Bradstreet Austria

S7: State Surveillance: Introducing Bulk Powers

In 2013, documents leaked by Edward Snowden had a transformative effect on law, business practices, and perceptions. In this first surveillance seminar, we will try to look at the bigger picture: the technologies being used, and some of the human rights battles around the old and new regimes.

When reading, consider the following:

  • What kind of impact on privacy is there from using content of communications? What kind from metadata?
  • How might you conceptualise the ECtHR’s trend in attitudes towards bulk collection/mass surveillance regimes?
  • What are the geopolitical implications of bulk collection, interception and interference practices? How does this relate to the jurisdiction and physical location of cloud companies?
  • Intelligence agencies commonly argue that bulk regimes are not particularly invasive as only some material is ever selected and read by humans. Do you consider that a sufficient safeguard?
  • What does surveillance of these types tell us about the role of intermediaries? What powers and responsibilities do they have; how have they used these; and how should they use them?

Articles

Policy Documents

  • Compulsory David Anderson, Report of the Bulk Powers Review (Her Majesty’s Stationery Office 2016). OA link
  • Compulsory Unknown Author (OPC-MCR/GCHQ), ‘HIMR Data Mining Research Problem Book’ (Contained in the Snowden Leaks, 20 September 2011) OA link
    • Skim the top-secret ‘open question’ research problem book of the Heilbronn Institute for Mathematical Research (University of Bristol), a GCHQ-funded research centre that works on highly classified research. This book, part of the Snowden leaks, describes the kind of data and practices that GCHQ have and provide to the researchers who develop the techniques to analyse it. Some of this is very technical; I recommend pages 7–15, 51–53 and 67–68. The data sources at the end, particularly pp 69–74, may also be interesting.
  • Recommended David Anderson, A Question of Trust (Her Majesty’s Stationery Office 2015). OA link
  • Optional Caspar Bowden, The US Surveillance Programmes and Their Impact on EU Citizens’ Fundamental Rights (European Parliament 2013) OA link

Videos

  • Recommended Caspar Bowden, The Cloud Conspiracy 2008-14 (The 31st Chaos Computer Congress, 31C3 2014) OA link

Databases

  • Compulsory Browse the Snowden Archive. Look by Program, and for each entry you can click on the news article analysing it at the bottom, or access the original PDF of the document itself. Note: if the link there does not work, an alternative, although less nice-to-browse, repository of documents can be found at the Internet Archive.

T3: Big Brother Watch

In this tutorial, we will be looking at Big Brother Watch and Others v UK.

Specifically, we will look at the following questions:

  1. What were the alleged violations of Article 8 in Big Brother Watch? (In particular, what were the three discrete regimes at issue in the case?). What are the requirements for finding a violation of Art 8? How did Article 10 feature in the case?
  2. Do you agree with the way the court analysed the intrusiveness of metadata? Does the acquisition of metadata deserve an equivalent level of protection to the content of communications?
  3. To what extent would you describe BBW as a pyrrhic victory for the applicants?
  4. How much do you sympathise with the dissents? Is the trajectory of the ECtHR fit for the 21st century?

Required

S8: State Surveillance: Bulk Powers and Human Rights

Arguably, the strongest control over state surveillance is exercised through the European Convention on Human Rights. In this session, we’re going to look at how it has understood state surveillance across the years, and some of the key tensions it faces going forward.

  1. How can we characterise the way the ECtHR has or has not changed in relation to the changing nature of surveillance?
  2. Should courts pay closer attention to the substance of surveillance powers? How?
  3. Is a bulk collection or interception regime a natural progression or a disjunct leap from targeted collection/interception? What is the view of the ECtHR on this?
  4. Should individuals be notified that they have been subject to surveillance measures, when it would not undermine investigations to do so?

Readings

  • Compulsory Eleni Kosta, ‘Surveilling Masses and Unveiling Human Rights: Uneasy Choices for the Strasbourg Court’ (Inaugural Address, Tilburg Law School, 2017) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3167723>.
  • Recommended Nóra Ní Loideáin, ‘A Bridge Too Far? The Investigatory Powers Act 2016 and Human Rights Law’ in Lilian Edwards (ed), Law, Policy, and the Internet (Hart Publishing 2019). UCL link
  • Recommended Bernard Keenan, ‘The Evolution of Elucidation: The Snowden Cases before the Investigatory Powers Tribunal’ [2021] The Modern Law Review. UCL link
  • Recommended Théodore Christakis and Katia Bouslimani, ‘National Security, Surveillance, and Human Rights’, in the Oxford Handbook of the International Law of Global Security (OUP 2021). UCL link
  • Recommended Eleni Kosta, ‘Algorithmic State Surveillance: Challenging the Notion of Agency in Human Rights’ (2020) Regulation & Governance. OA link
  • Recommended European Court of Human Rights, Guide on Art 8 of the European Convention on Human Rights (Council of Europe, updated regularly) (look at relevant parts to bolster your understanding) OA link
  • Optional Andrew Murray, ‘State Surveillance and Data Retention’ in Information Technology Law (Oxford University Press 2019). (a more simplified high-level overview)
  • Optional Bart van der Sloot and Eleni Kosta, ‘Big Brother Watch and Others v UK: Lessons from the Latest Strasbourg Ruling on Bulk Surveillance Case Notes’ (2019) 5 Eur Data Prot L Rev 252.
  • Optional Pierre Notermans, ‘Surveillance Measures and the Exception of National Security in the Case Law of the European Court of Human Rights’ in Human Rights in Times of Transition (Edward Elgar Publishing 2020) UCL link

Cases

  • Compulsory Big Brother Watch and Others v the United Kingdom (Grand Chamber) ECLI:CE:ECHR:2021:0525JUD005817013.
  • Optional Malone v the United Kingdom ECLI:CE:ECHR:1984:0802JUD000869179.
  • Optional S and Marper v the United Kingdom ECLI:CE:ECHR:2008:1204JUD003056204.
  • Recommended Privacy International v Secretary of State for Foreign And Commonwealth Affairs & Ors [2016] UKIPTrib 15_110-CH.
  • Recommended Liberty & Ors v GCHQ & Ors [2014] UKIPTrib 13_77-H and Liberty & Ors v The Secretary of State for Foreign And Commonwealth Affairs & Ors [2015] UKIPTrib 13_77-H.

S9: Data Retention - the Never Ending Story

 __i ----------- ?
|---|    
|[_]|    
|:::|    
|:::|    
`\   \   
  \_=_\ 

Who called whom, and when? When governments wish to understand what communications individuals had with each other in the past, they want records to look back on for reference. But keeping hold of all that data is expensive, and telecoms companies are loath to do it. Enter data retention, a dynamic area of law characterised mainly by an ongoing battle between the CJEU and EU member states as to its permissible extent.
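As a small illustration, with entirely invented records, of why “who called whom, and when” matters so much: the Python sketch below aggregates a handful of retained call records, and the inferences it enables are the ones the chilling-effect arguments in the readings are about.

```python
# Invented call records illustrating why retained metadata is revealing even
# though no content is retained: patterns of who contacted whom, and when,
# already support sensitive inferences.
from collections import Counter

call_records = [  # (caller, callee, timestamp): the kind of data retention covers
    ("subscriber_A", "oncology_clinic", "2021-03-01 09:15"),
    ("subscriber_A", "oncology_clinic", "2021-03-08 09:20"),
    ("subscriber_A", "investigative_journalist", "2021-03-02 23:40"),
    ("subscriber_B", "pizza_delivery", "2021-03-01 19:02"),
]

contacts = Counter((caller, callee) for caller, callee, _ in call_records)
for (caller, callee), n in contacts.most_common():
    print(f"{caller} -> {callee}: {n} call(s)")
# Repeated clinic calls and a late-night call to a journalist suggest a health
# condition and a press source without a single conversation being listened to.
```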

  • Why does data retention fall within EU law? Should it?
  • Has the Court of Justice gone too far, or has the ECHR not gone far enough?
  • Do you agree that metadata might have a chilling effect on freedom of expression?
  • What do you think Internet Connection Records are in the IPA? Does retaining these present additional human rights concerns? (hint: this was a novelty not present in RIPA).
  • How do the safeguards proposed in La Quadrature du Net compare to other aspects of the intelligence regime (e.g. in the UK)? Which might be easy to carry out, and which might be more difficult, or represent a significant change from e.g. the safeguards in the IPA?

Articles

  • Compulsory Marcin Rojszczak, ‘National Security and Retention of Telecommunications Data in Light of Recent Case Law of the European Courts’ [2021] European Constitutional Law Review. OA link
  • Compulsory Eleni Kosta, ‘The Retention of Communications Data in Europe and the UK’ in Lilian Edwards (ed), Law, Policy, and the Internet (Hart Publishing 2019) OA link
  • Recommended Marcin Rojszczak, ‘The Uncertain Future of Data Retention Laws in the EU: Is a Legislative Reset Possible?’ (2021) 41 Computer Law & Security Review 105572. UCL link OA link
  • Optional Privacy International, ‘National Data Retention Laws since the CJEU’s Tele-2/Watson Judgment. A Concerning State of Play for the Right to Privacy in Europe’ (Privacy International, September 2017) OA link

Cases

European Union

  • Compulsory Joined Cases C‑511/18, C‑512/18 and C‑520/18 La Quadrature du Net and Others ECLI:EU:C:2020:791.
  • Recommended Case C-623/17 Privacy International ECLI:EU:C:2020:790.
  • Recommended Case C-140/20 Commissioner of An Garda Síochána ECLI:EU:C:2022:258
  • Optional Case C-793/19 SpaceNet ECLI:EU:C:2022:702
  • Optional Case C‑746/18 HK v Prokuratuur ECLI:EU:C:2021:152.
  • Optional Case C-350/21 Spetsializirana prokuratura ECLI:EU:C:2022:896
  • Recommended Joined Cases C‑293/12 and C‑594/12 Digital Rights Ireland and Others ECLI:EU:C:2014:238.
  • Recommended Joined Cases C-203/15 and C-698/15 Tele2 Sverige AB v Post- och telestyrelsen and Secretary of State for the Home Department v Tom Watson and Others ECLI:EU:C:2016:970.

United Kingdom

  • Optional R (on the application of Davis and others) v The Secretary of State for the Home Department [2015] EWHC 2092 (Admin)
    • and in the EWCA: Secretary of State for the Home Department v Watson MP & Ors [2018] EWCA Civ 70.

Statute

  • Recommended Investigatory Powers Act 2016 part 4 (Retention of Communications Data) link.

S10: Data Transfers

Note if reading ahead: these readings are likely to change slightly if and when the EU signs a new data transfer agreement with the United States, and if the Irish DPC enacts a suspension of data flows concerning Facebook.

Articles

  • Compulsory Christopher Kuner, ‘Reality and Illusion in EU Data Transfer Regulation Post Schrems’ (2017) 18 German Law Journal 881. OA link
  • Optional Christopher Kuner, ‘Schrems II Re-Examined’ (Verfassungsblog, 25 Aug 2020) OA link
    • Recommended Read alongside Douwe Korff, ‘Comments on Prof. Chris Kuner’s Blog Schrems II Re-Examined of 25 August 2020’ (26 August 2020) OA-ish link (alt link)
  • Optional Barbara Sandfuchs, ‘The Future of Data Transfers to Third Countries in Light of the CJEU’s Judgment C-311/18 – Schrems II’ (2021) GRUR Int ikaa204 UCL link
  • Recommended Andrew D Murray, ‘Data Transfers between the EU and UK Post Brexit?’ (2017) 7 International Data Privacy Law 149 OA link
  • Optional Graham Greenleaf, ‘Japan: EU Adequacy Discounted’ (2018) 155 Privacy Laws & Business International Report 8-10 OA link

Policy Documents

  • Recommended Privacy International, ‘Secret Global Surveillance Networks’ (Privacy International, 2018) OA link
  • Optional European Data Protection Board, ‘Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data, Version 2’ (18 June 2021) OA link

Statute

  • Compulsory GDPR, chapter V.
  • Recommended Commission Implementing Decision (EU) 2021/1773 of 28 June 2021 pursuant to Directive (EU) 2016/680 of the European Parliament and of the Council on the adequate protection of personal data by the United Kingdom (notified under document C(2021) 4801) OJ L360/69. OA link

Cases

European Union

  • Compulsory Case C-311/18 Data Protection Commissioner v Facebook Ireland and Schrems ECLI:EU:C:2020:559 (“Schrems II”) link
  • Recommended Case C-362/14 Maximillian Schrems v Data Protection Commissioner ECLI:EU:C:2015:650 (“Schrems I”)

Ireland

  • Optional The Data Protection Commissioner -v- Facebook Ireland Ltd & Anor [2017] IEHC 545 (Ireland)
    • This case is very useful in restating the facts of data transfers in the context of Facebook and US surveillance under FISA 702 and EO 12333. It is the case which led to the questions being referred to the CJEU in Schrems II.

Acknowledgments

ASCII art from https://www.asciiart.eu/space/telescopes, credited to SSt.