LAWS0366 Internet Law and Policy

Convened by Dr Michael Veale, co-taught with Professor Orla Lynskey, Dr Bernard Keenan


This is a 10-week module delivered to final year LLB students at the Faculty of Laws, University College London in the 2024-5 academic year. We provide the syllabus online to support people learning and teaching these topics around the world, and hope that it encourages colleagues elsewhere to make their materials similarly available.

 

L1: Welcome to the Internet (Yes, We Have Cats)

L2: How To Control the Internet

L3: Intermediaries I: Liability for Online Content

T1: D’oH! Code, Law and Politics of Encrypted DNS

L4: Intermediaries II: Moderation, Algorithms and General Monitoring

L5: Intermediaries III: Platform Regulation and Regulating-by-Design

T2: Regulating Recommending

L6: Data Protection: Form, Function and Reach

L7: Data Protection’s Substance: Cutting to the Core

T3: Data Rights and Wrongs

L8: Being Forgotten Online

L9: Bidding for Attention: Online Tracking and Data Protection Law

T4: Let’s Do It At My Place Instead? Challenges of Encrypted Computation In Advertising and Beyond

L10: Artificial Intelligence and Data Protection

L1: Welcome to the Internet (Yes, We Have Cats)

Dr Michael Veale

We all use the Internet. But how many people know what it is; where it came from; how it works? This session will introduce the history of the Internet and the functioning of core Internet technologies. Why were they designed as they were; and with whose values at the core? We will start to see why and how these design decisions interact with law and policy concerning the online world; a theme that will recur throughout the module.

Learning Objectives

·       What are the core technologies underpinning the Internet and how (in very rough terms) do they work?

·       What is the difference between the Internet and the Web?

·       What technical design principles underpin the Internet? Whose values do they reflect? How do these compare to broader values, for example those reflected in human rights regimes?

·       What are some of the main actors and organisations that govern the Internet, and how do they make decisions?

·       What is ‘cyberlibertarianism’?

 

Note   As legal scholars we have to understand technologies enough to be able to reason about how law and policy applies to them, not to be able to deploy them ourselves. So don’t worry if you don’t understand everything, or it seems too technical — focus on trying to get a general understanding.

Readings

Compulsory         Andrés Guadamuz, ‘Internet Regulation’ in Lilian Edwards (ed), Law, Policy, and the Internet (Hart Publishing 2019)

A useful and broad-ranging introduction to what it is to study Internet law and policy.

Required       Corinne Cath-Speth, ‘Internet Histories: Partial Visions of People and Packets’ in Corinne Cath-Speth, Changing Minds and Machines: A Case Study of Human Rights Advocacy in the Internet Engineering Task Force (IETF) (DPhil Thesis, Oxford University, 2021) pages 27-51.

Traces and critiques the cultural and historical values behind the Internet engineers often seen as the main characters in a predominantly white, male, biographical history of the Internet.

Required       William Lehr and others, ‘Whither the Public Internet?’ (2019) 9 Journal of Information Policy 1. Read pages 1-20.

This article very usefully unpacks three different ways of understanding what “the Internet” is. Parts of this article use some difficult terminology: it is not essential that you understand it all.

Required       Malte Ziewitz and Ian Brown, ‘A Prehistory of Internet Governance’ in Research Handbook on Governance of the Internet (Edward Elgar Publishing 2013).

A history of the Internet and its early governance, as well as the institutions involved. Perhaps the best of the many “great men” tales of Internet history critiqued by Cath-Speth (2021).

Further                  Kieron O’Hara and Wendy Hall, Four Internets: Data, Geopolitics, and the Governance of Cyberspace (Oxford University Press 2021). paywall link / UCL link

A recent overview of the way in which the Internet as we know it is diverging, as different nations and jurisdictions have different views on its development and trajectory.

L2: How To Control the Internet

Dr Michael Veale

The Internet is sometimes described by politicians as a “Wild West”; a regulation-free land that has long resisted control. According to some, it needs bringing to heel. This is an overly simplistic story. In this session, we will build on the previous session’s understanding of Internet technologies and their design principles to think both about the ways in which this network has resisted control, and the ways in which it has been controlled by both public and private actors.

Learning Objectives

·       Does code ‘regulate’? How does this differ from the types of regulation that lawyers are more familiar with discussing?

·       What does it mean to say the Internet is ‘generative’? What are the policy implications of generative technologies?

·       Which actors are well-positioned to regulate the Internet, and why? What factors make it easier or harder for these actors to alter the working of these networks?

Articles

Compulsory         Jonathan L Zittrain, ‘The Generative Internet’ (2006) 119 Harv L Rev 1974. OA link

This article is in some ways a prediction of how ‘walled gardens’ online might emerge. Was Zittrain right?

Compulsory         Julie E Cohen, ‘“Piracy,” “Security,” and Architectures of Control’ in Configuring the Networked Self: Law, Code, and the Play of Everyday Practice (Yale University Press 2012). OA link

A strong critique of a variety of ways of looking at ‘code’ and ‘law’ - this might be worth coming back to after we talk about these topics some more as it’s a very dense and rewarding read.

Required       Clément Perarnaud and others, ‘“Splinternets”: Addressing the Renewed Debate on Internet Fragmentation’ (European Parliamentary Research Service, 2022).

Further                  Lawrence Lessig, Code: Version 2.0 (Basic Books 2006) OA link

I suggest looking at pages 61-137 (chapters 5-7). Lessig is referred to a lot in the other readings, and if writing about him you should have a look at what he says in his own words.

Further                  Niels ten Oever, ‘“This is Not How We Imagined It”: Technological Affordances, Economic Drivers, and the Internet Architecture Imaginary’ (2021) 23 New Media & Society 344. OA link

This paper is good for those interested in some of the ways in which the technical sides of standards are value laden and contested. Further readings in this vein can be found in Beatrice Martini, ‘Internet Infrastructure and Human Rights: Reading List’ (Stanford PACS, 2020).


 

L3: Intermediaries I: Liability for Online Content

Dr Michael Veale

Who is responsible for what happens on the Internet? The companies with the biggest pockets often sit in between the creator of content and the user viewing it. Furthermore, it can be difficult to identify who says what online, particularly where speakers are outside the jurisdiction, or act in technical ways that mask their identity. Yet people online — with good or nefarious motives — often want content to go away. The obvious actors to go after in this setting are the middlemen — companies like Google or Facebook which in theory have the ability and resources to stop content being distributed. However, this has a fraught history, and if such companies were responsible for everything, it is unlikely that they would want to promote expression particularly openly and freely. Since the late 1990s, law has responded, creating a regime of shielding for many companies that some hail as central to the development of the Internet and others lament as the foundation of a culture of online irresponsibility. We’ll look at the origins and underlying legal structures in this session, before moving on in future sessions to look at more complex elements of this landscape and how they have developed over time.

Learning Objectives

·       What was the initial logic behind intermediary liability laws? Was that logic legitimate at the time, and does it remain so today?

·       What factors have the CJEU indicated do not compromise the shielding of intermediaries? Do you agree with these judgments?

·       What concept of the ‘neutral’ intermediary has the CJEU built? Are modern platforms neutral in this way?

Articles and Chapters

Compulsory         Lilian Edwards, ‘“With Great Power Comes Great Responsibility?”: The Rise of Platform Liability’ in Lilian Edwards (ed), Law, Policy, and the Internet (Hart Publishing 2019). UCL link

This chapter looks over the history of intermediary liability law. It does not quite reach the present day: since it was written, the Digital Services Act, the Copyright in the Digital Single Market Directive, and the Terrorist Content Regulation (all in the EU), and in the UK the Online Safety Act 2023, have all come to interact with this regime.

Required       Philippe Jougleux, ‘Intermediaries’ Liability: Where Is My Chair?’ in Philippe Jougleux (ed), Facebook and the (EU) Law: How the Social Network Reshaped the Legal Framework (Springer International Publishing 2022)

A slightly more up-to-date overview than the Edwards chapter above, with an EU focus.

Required       Aleksandra Kuczerawy, ‘From “Notice and Take Down” to “Notice and Stay Down”: Risks and Safeguards for Freedom of Expression’ in The Oxford Handbook of Intermediary Liability Online (Oxford University Press 2020). UCL link

Required       Folkert Wilman, ‘Between preservation and clarification: The evolution of the DSA’s liability rules in light of the CJEU’s case law’ in Joris Van Hoboken and others (eds) Putting the DSA into Practice: Enforcement, Access to Justice, and Global Implications (Verfassungsbooks 2023).

Further                  Jennifer Urban and others, ‘Notice and Takedown in Everyday Practice’ (UC Berkeley Public Law Research Paper No. 2755628, 2017)

Read the introduction & executive summary (pp 1-13) and delve into the bits that interest you - it’s a long report.

Further                  Ben Wagner and others, ‘Regulating Transparency? Facebook, Twitter and the German Network Enforcement Act’ (2020) Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. OA-ish link / UCL link / 8 min talk on the paper

Further                  Sophie Stalla-Bourdillon, ‘Internet Intermediaries as Responsible Actors? Why It Is Time to Rethink the E-Commerce Directive as Well’ in Mariarosaria Taddeo and Luciano Floridi (eds), The Responsibilities of Online Service Providers (Springer 2017). UCL link

Further                  Graham Smith, ‘5.6 Liability of Online Intermediaries’ in Internet Law and Regulation (Sweet and Maxwell 2020) Book available on Westlaw UK

Part of a practitioner textbook on Internet Law. Goes into heavy detail on the jurisprudence in the area.

Further                  Nico van Eijk and others, Hosting Intermediary Services and Illegal Content Online: An Analysis of the Scope of Article 14 ECD in Light of Developments in the Online Service Landscape: Final Report. (European Commission 2019).

Further                  Colten Meisner, ‘The Weaponization of Platform Governance: Mass Reporting and Algorithmic Punishments in the Creator Economy’ [2023] Policy & Internet. (UCL 🔒)

Further                  Philipp Hacker, Andreas Engel and Marco Mauer, ‘Regulating ChatGPT and Other Large Generative AI Models’, Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency (Association for Computing Machinery 2023).

Do AI systems benefit from intermediary shielding under the Digital Services Act? In section 5, Hacker and colleagues argue they do not.

Statute

European Union

Compulsory         (“DSA”) Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1, arts 4, 5, 6, 8.

Further                  (Parts repealed by the DSA) Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) OJ L 178/1, recitals 40-49, arts 1, 12–15.

The Digital Services Act (DSA) replaces the relevant parts of the e-Commerce Directive from February 2024. ECD, arts 12–15 map to DSA arts 4, 5, 6, and 8. It is fine to already refer to the DSA in essays rather than the ECD without outlining this transition, but you’ll need to be aware of the ECD to refer to relevant case-law.

 

🇬🇧 Brexit Note 🇪🇺: The DSA does not apply in the UK, which will still use the transposition of the ECD (The Electronic Commerce (EC Directive) Regulations 2002). These regulations do not explicitly contain art 15 of the ECD, the general monitoring prohibition, as the UK relied on the vertical direct effect of the ECD to apply that provision (i.e. to prevent Parliament adopting contrary laws, and to prevent emanations of the state from applying general monitoring obligations), and that effect has lapsed.

United States

US law in this area centres on ‘Section 230’. This part of US law has its own mythology and saga-filled history. If interested, you can follow it in Jeff Kosseff, The Twenty-Six Words That Created the Internet (Cornell University Press 2019) (UCL 🔒), and for an alternative, more critical view on this provision, see Mary Anne Franks, ‘The Cult of the Internet’, The Cult of the Constitution (Stanford University Press 2019) (UCL 🔒).

Required       47 U.S.C. 230(c) (‘Section 230’).

This is the broad-ranging liability shield in US law which does not have a notice and takedown requirement. Often referred to, technically incorrectly, as ‘Section 230 of the Communications Decency Act’, which amended that section into existence. This is such a common error that it is now more a shorthand, and even appears to be cited that way by the US Supreme Court (see Justice Alito in Moody v NetChoice, 603 U.S. 707 (2024) at 7), so feel free to refer to it that way.

Case Law

European Union

Compulsory         Case C-682/18 YouTube and Cyando ECLI:EU:C:2021:503 paras 18-39, 103-118.

A case concerning whether certain platform features give an “active role” to intermediaries.

Required       Case C-70/10 Scarlet Extended ECLI:EU:C:2011:771

On attempted general monitoring obligations applied to mere conduits (a Belgian ISP).

Further                  Case C-360/10 Netlog ECLI:EU:C:2012:85

Same issue as Scarlet, delivered on the same day, relates to hosting on Netlog, a now-defunct Belgian social network.

Required       Joined Cases C-236/08 and C-238/08 Google France ECLI:EU:C:2010:159

Focus on paras 22-32 and 106-120, you don’t need to understand the relevant trademark law.

Required       Case C-324/09 L’Oréal SA v eBay ECLI:EU:C:2011:474

Consider what being a ‘diligent economic operator’ means in this judgment.

Further                  Case C-314/12 Telekabel ECLI:EU:C:2014:192 paras 26-50, 106-114

Further                  Case C-484/14 Mc Fadden ECLI:EU:C:2016:689

 

💡 Reading the Advocate General opinions for these cases can elaborate on how they link to previous case law, and they are written in a less robotic way than the CJEU (although remember that the Court does not always follow the reasoning of the AG.) You can access these on the CJEU’s own webpage using the Curia search by entering the case number.

European Court of Human Rights

The case-law of the ECtHR on intermediary liability has at times seemed at odds with that of the CJEU. The case of Delfi v Estonia received criticism for seeming to ignore EU law. While the ECtHR seemed to draw some boundaries around the Delfi case, these appear to have been radically loosened in the more recent 2023 Grand Chamber judgment in Sanchez v France, which found no violation of freedom of expression where the French state held a politician criminally liable for not removing hateful comments under his post on a friends-only Facebook page, despite his not being explicitly notified of these posts by those taking the action.

Required       Sanchez v France ECLI:CE:ECHR:2023:0515JUD004558115

See analysis of the potential effect of this judgment in Jacob van de Kerkhof, ‘Sanchez v France: The Expansion of Intermediary Liability in the Context of Online Hate Speech’ (Strasbourg Observers, 17 July 2023).

Required       Delfi AS v Estonia ECLI:CE:ECHR:2015:0616JUD006456909

Can this case be reconciled with EU law? Arguments that it can be, such as Aleksandra Kuczerawy and Pieter-Jan Ombelet, ‘Not so Different after All? Reconciling Delfi vs. Estonia with EU Rules on Intermediary Liability’ (Media@LSE, 1 July 2015), seem to rely on a finding opposite to that of the CJEU in YouTube and Cyando, now confirmed in the recitals of the DSA, that knowledge of a violation needs to be specific, not just general.

Further                  MTE v Hungary ECLI:CE:ECHR:2016:0202JUD002294713

Further                  Pihl v Sweden ECLI:CE:ECHR:2017:0207DEC007474214

Both MTE v Hungary and Pihl v Sweden are cases where the ECtHR attempts to distinguish from Delfi on the basis of the extremity of the speech in question — something quite different from the DSA’s intermediary liability regime, which provides shielding without considering the nature of the content.

 

💡 The ECtHR maintains a list of decided and pending cases relating to freedom of expression and hate speech online within their Press Unit’s Hate Speech factsheet (under ‘Online Hate Speech’), although it is not always fully updated.

United Kingdom

The UK and E&W case law on intermediary liability is of less central interest to this module. Much of it relates to defamation. You can read more about the trajectory of this case law, including most of the cases below, in Jaani Riordan, ‘Defamation’ in Jaani Riordan (ed), The Liability of Internet Intermediaries (Oxford University Press 2016) (UCL 🔒).

Further                  Godfrey v Demon Internet Ltd [1999] EWHC QB 244.

Among the first (possibly the first) defamation cases on the Internet in England and Wales — primarily of historical interest today.

Further                  Payam Tamiz v Google Inc [2013] EWCA Civ 68

Establishes at common law that an intermediary such as Google providing Blogger.com would not be a publisher of comments made by a user of that platform (further shielding that may apply were the defence in the e-Commerce Directive to be removed).

Further                  Cartier International AG & Ors v British Telecommunications Plc & Anor [2018] UKSC 28

This case is largely of tangential interest, but to date is the only time the UK Supreme Court has been asked to consider a question relating explicitly to the liability of internet intermediaries.


 

T1: D’oH! Code, Law and Politics of Encrypted DNS

Professor Orla Lynskey and Dr Bernard Keenan

The domain name system (DNS) is a core part of the way that the Internet works. It effectively allows us to browse content without having to remember addresses that were designed only for computers, in particular IP addresses (which look like 43.157.242.47). However, it has also been described as the “Achilles heel” of the Internet, because it has been a serious ground of contestation around issues such as privacy and Internet blocking. This is effectively because if you can control the address book of the Internet, you can make resources a lot harder to discover. Most recently, a series of proposals have been made which significantly change the way that this system works – in particular by encrypting it. These services are typically called DNS over HTTPS — or ‘DoH’ — because HTTPS is the encrypted way of delivering services on the Web (the ‘S’ is for Secure). They interact and interfere with a range of existing mechanisms to block content, and so have been controversial. But who is responsible for this change? The standard setters? The Web browsers? The internet service providers? Legislators? In this tutorial, we will use DNS-over-HTTPS as an example to answer these questions.

💡 This topic is technical but we will approach it as law scholars. You are not expected to know the technical details of DNS, but you are expected to understand the broad way it works, and the types of changes that encrypting it will lead to. Try and avoid getting in the weeds and continue to ask the question of ‘so what?’ when you read more technical work, thinking of the legal and policy consequences you might imagine. The ‘so what?’ question should guide your reading — what questions do you really need answers to that are technical, and which can you abstract away and ignore?
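For those who find a concrete sketch helpful, the core policy point can be captured in a few lines of Python. This is a toy model, not real DNS: all domain names, IP addresses and the blocklist below are hypothetical, and real resolution involves recursive resolvers, caching and encrypted transport. The sketch shows only why resolver-level filtering works, and why routing lookups to a different, encrypted resolver bypasses it.

```python
# Toy model of DNS filtering. A resolver maps human-readable names to IP
# addresses; whoever answers the lookup decides what can be found.

# Hypothetical "address book" of the Internet.
ZONE = {
    "example.org": "93.184.216.34",
    "blocked.example": "203.0.113.7",
}

def isp_resolve(name, blocklist=frozenset({"blocked.example"})):
    """An ISP-operated resolver: refuses to answer for names on its blocklist."""
    if name in blocklist:
        return None  # filtered — the user never learns the address
    return ZONE.get(name)

def doh_resolve(name):
    """A third-party encrypted (DoH-style) resolver, e.g. chosen by a browser.
    The ISP cannot see or intercept these queries, so its blocklist never applies."""
    return ZONE.get(name)

print(isp_resolve("blocked.example"))  # None: blocked at the ISP's resolver
print(doh_resolve("blocked.example"))  # 203.0.113.7: the filter is bypassed
```

The legal questions in this tutorial follow directly from this asymmetry: the UK’s blocking arrangements assume lookups pass through the ISP’s resolver, and encrypted DNS moves that choke point elsewhere.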

Tutorial Questions

Make notes on these topics to bring to the tutorial to discuss.

·       What is DNS? What is important to know about DNS in relation to law and policy? How is DNS used in Internet blocking?

·       What kind of body is the Internet Watch Foundation? Is it a regulator?

·       Who decides whether encrypted DNS systems, such as DNS-over-HTTPS (DoH) are introduced?

·       What does this mean for the power of private firms vis-a-vis the state?

·       The UK’s Internet blocking regime is heavily built on DNS filtering. Should this be considered when encrypted DNS is introduced? Who should consider it, and how? How might this work internationally?

·       Many countries have recently been talking about digital sovereignty (also souveraineté numérique in France). Do you think that DNS illustrates a loss, or highlights a lack, of nation states’ digital sovereignty?

Readings

Compulsory         Video 📹: Mike Pound, How DNS Works (Computerphile, University of Nottingham, 9 July 2020).

Compulsory         Debate in Hansard on “Internet Encryption” (HL Deb 14 May 2019, vol 797, cols 1492–1495)

A good opportunity to see one of the strangest legislative chambers in the world talk about a highly technical issue.

Compulsory         Open Rights Group, ‘DNS Security — Getting it Right’ (24 June 2019)

The Open Rights Group (ORG) is a UK-based digital rights NGO. This report outlines many of the technical aspects of DNS and their consequences, written from the standpoint of a digital rights advocacy organisation.

Compulsory         Internet Watch Foundation, ‘Briefing on DNS-over-HTTPS’

A briefing that in many ways poses a counterpoint to the Open Rights Group paper.

Required       EU Innovation Hub for Internal Security, ‘First Report on Encryption’ (2024) pages 33-37 only.

This short section is good at distinguishing the different types of encrypted DNS: some still give law enforcement the ability to understand a little of what is going on, others blur DNS with normal HTTPS data (the actual content of your website) so they can’t work out much. You do not need to look at too many of the technical details; stay focussed on the broader consequences.

Required       Internet Society, ‘Encrypted DNS Factsheet’ (May 2023)

A shorter complement to the ORG paper above. Contains diagrams of DNS and a brief description of some of the different versions of encrypted DNS (the distinctions between which are not necessary to focus on).


 

L4: Intermediaries II: Moderation, Algorithms and General Monitoring

Dr Michael Veale

Intermediary liability shielding, as established in the previous session, immunises certain actors against liability in certain situations, sometimes conditional on them doing or not doing a certain thing. They often have no obligation to look for content actively, which would, potentially, remove their immunity. In order to make this meaningful, EU law contains a prohibition on imposing ‘general monitoring’ obligations. In this session, we will consider what this means, and how far it might really apply today, in a complex Internet landscape of intermediaries that do (and can do) more than just host whatever content they are told.

Learning Objectives

·       Why might a prohibition on general monitoring obligations be justified? What kinds of issues is it trying to prevent?

·       Has the CJEU changed its understanding of general monitoring over time? What are the arguments for and against this kind of change?

·       Is it easy to search for illegal content now we have more advanced technologies to help us do it? What can go wrong? Is it just a matter of time before they become good enough?

Articles

Compulsory         Joris Van Hoboken and Daphne Keller, ‘Design Principles for Intermediary Liability Laws’ (Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression Working Paper, 8 October 2019)

This guide looks at the different configurations and components commonly found in international laws and proposals around intermediary liability.

Compulsory         Robert Gorwa, Reuben Binns and Christian Katzenbach, ‘Algorithmic Content Moderation: Technical and Political Challenges in the Automation of Platform Governance’ (2020) 7 Big Data & Society 2053951719897945. OA link

This paper critically considers the limits of automation of content moderation policies.

Required       João Pedro Quintais, Naomi Appelman and Ronan Ó Fathaigh, ‘Using Terms and Conditions to Apply Fundamental Rights to Content Moderation’ (2023) 24 German Law Journal 881.

Required       Daphne Keller, ‘Facebook Filters, Fundamental Rights, and the CJEU’s Glawischnig-Piesczek Ruling’ (2020) 69 GRUR Int 616. paywall link / UCL link

This paper is critical of the Glawischnig-Piesczek case, arguing that many parts of it might end up in overreach.

Required       Christina Angelopoulos and Martin Senftleben, ‘An Endless Odyssey? Content Moderation Without General Content Monitoring Obligations’ (SSRN Working Paper, 2021).

Required       Aleksandra Kuczerawy, ‘General Monitoring Obligations: A New Cornerstone of Internet Regulation in the EU?’ in Rethinking IT and IP Law - Celebrating 30 years CiTiP (Intersentia 2019).

Required       Giovanni Sartor, ‘The Impact of Algorithms for Online Content Filtering or Moderation: Upload Filters’ (European Parliament 2020).

Further                  Maayan Perel and Niva Elkin-Koren, ‘Accountability in Algorithmic Copyright Enforcement’ (2015) 19 Stan Tech L Rev 473.

Further                  Graham Smith, ‘5.6 Liability of Online Intermediaries’ in Internet Law and Regulation (Sweet and Maxwell 2020) Book available on Westlaw UK

Further                  Giancarlo Frosio (ed.) Oxford Handbook of Online Intermediary Liability (Oxford University Press 2020) Closed access DOI UCL link

This is a useful tome for further reading on intermediary liability, in particular comparing different regimes beyond the UK, Europe or the United States.

Case Law

European Union

Compulsory         Case C-18/18 Glawischnig-Piesczek v Facebook ECLI:EU:C:2019:821

A defamation case which looks at what kind of obligations can be placed on an intermediary to take down content that is similar to that which is subject of the initial complaint. Look carefully at the wording in the judgment, and consider its link to the availability of search tools, particularly in light of readings above such as Keller (2020) as well as Gorwa et al. (2020).

Required       Case C-70/10 Scarlet Extended ECLI:EU:C:2011:771 (relates to mere conduits) or Case C-360/10 Netlog ECLI:EU:C:2012:85 (same issue as Scarlet, relates to hosting on Netlog, a now-defunct Belgian social network)

You do not need to read both of these as you will see they are very similar. Scarlet is probably marginally more important to read.

Further                  Case C-314/12 Telekabel ECLI:EU:C:2014:192 paras 26-50, 106-114

Further                  Case C-484/14 Tobias Mc Fadden v Sony Music Entertainment Germany GmbH ECLI:EU:C:2016:689.

This case concerned a WiFi network in a shop that was being used to break the law. A court asked the CJEU whether an injunction against the shop was lawful under the e-Commerce Directive if it required scanning of all content, termination of the WiFi network, or password-protecting the network. The CJEU stated that of the three, only the last was allowed, as long as it was effective and required users to reveal their identity in order to get the password.

Further                  Case C-401/19 Poland v Parliament ECLI:EU:C:2022:297

L5: Intermediaries III: Platform Regulation and Regulating-by-Design

Dr Michael Veale

Learning Objectives

·       Is it useful to try and strictly define platforms? If we do, should we define them by what they are, by what they do, or by something else?

·       What challenges do platforms pose that we might want to regulate? Which do you think are the most pressing or important?

·       Why is it hard to regulate platforms?

·       What are some of the current regulatory trends to try and regulate this area? Do you think they will work — and if so, for whom?

·       What are the opportunities and risks of requiring platforms to act in a more ‘proactive’ manner? What are the implications for the platform business model, and for the political economy of the Internet?

Articles

Compulsory         Rachel Griffin, ‘The Law and Political Economy of Online Visibility: Market Justice in the Digital Services Act’ [2023] Technology and Regulation 69.

Compulsory         Martin Husovec, ‘Will the DSA Work? On Money and Effort’ in Joris Van Hoboken and others (eds) Putting the DSA into Practice: Enforcement, Access to Justice, and Global Implications (Verfassungsbooks 2023).

This short piece takes a more practical look at the enforcement of the DSA, and the promise (and pitfalls) of some of its provisions.

Required       Miriam C Buiten, ‘The Digital Services Act: From Intermediary Liability to Platform Regulation’ (2022) 12 Journal of Intellectual Property, Information Technology and E-Commerce Law.

This paper outlines some of the features of the DSA, and how it builds on, changes and expands existing regimes.

Required       Julie E Cohen, ‘Law for the Platform Economy’ (2017–18) 51 UCD L Rev 133.

This piece lays out the theory and practice of platforms in the legal order, arguing that platforms construct their power out of law, rather than existing in a lawless space, and considering some of the tactics and approaches they use to be slippery and difficult to govern.

Required       María P. Angel and danah boyd, ‘Techno-legal Solutionism: Regulating Children’s Online Safety in the United States’ (2024) CSLAW’24: 3rd ACM Computer Science and Law Symposium.

Critiques how far a ‘safety-by-design’ approach can deal with the areas it targets, or whether it takes a dangerously techno-solutionist approach to issues, such as mental health, which resist such fixes.

Required       Graham Smith, ‘Take Care with That Social Media Duty of Care’ (Cyberleagle, 19 October 2018).

Further                  Robert Gorwa, ‘What is Platform Governance?’ (2019) 22 Information, Communication & Society 854.

Further                  Lorna Woods and Will Perrin, ‘Obliging Platforms to Accept a Duty of Care’ in Martin Moore and Damian Tambini (eds), Regulating Big Tech (Oxford University Press 2021) (🔒 UCL)

Further                  Christoph Busch, ‘Regulating the Expanding Content Moderation Universe: A European Perspective on Infrastructure Moderation’ (2022) 27 UCLA Journal of Law and Technology 32.

Statute

European Union

Compulsory         Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1,

Make sure to look at least at: arts 14(4) (Terms and conditions), 22 (Trusted flaggers), 25 (Online interface design and organisation), 27 (Recommender system transparency), 28 (Online protection of minors), 33 (Very large online platforms and very large online search engines), 34 (Risk assessment), 35 (Mitigation of risks), 37 (Independent audit), 38 (Recommender systems), 39 (Additional online advertising transparency), 40 (Data access and scrutiny), 44 (Standards), 74 (Fines).

United Kingdom

Compulsory         Online Safety Act 2023, part 2, part 3 ch 2, 4, 7, part 7 ch 5.

Singapore

Further                  Broadcasting Act 1994, part 10A, division 3.

Further                  Info-communications Media Development Authority (IMDA), ‘Broadcasting Act 1994: Code of Practice for Online Safety’ (17 July 2023).

This is the code of practice referred to in the Broadcasting Act 1994 (Singapore) s 45L. It currently applies to the entities listed by IMDA in this document.

Australia

Further                  Online Safety Act 2021, part 4.

In relation to proactive duties, while the Act does function with codes of conduct, similarly to Singapore, social media firms only have reporting obligations on how they are meeting these duties; the eSafety Commissioner does not have substantive powers unless these reporting duties are not met.


 

T2: Regulating Recommending

Professor Orla Lynskey and Dr Bernard Keenan

In this tutorial, we are going to think about recommender systems, as well as some of the approaches proposed to regulate them. Look at two articles, and the provisions in the Digital Services Act, and review your readings from previous sessions, to think about the questions below:

Tutorial Questions

·       Are algorithmic systems essential to deliver content online today? Is there a world without recommenders, or one where they are neutral? In what interfaces or media are they most important?

·       What kind of logics might be built into recommender systems? What kind of information about the content, or the users, might you need to do this, and where does this information come from?

·       Are recommender systems a good target for regulation of online content? What are the potential benefits, and what are the risks?

·       How might you actually regulate recommender systems? Do you think the approaches proposed in, for example, the Digital Services Act, will work?

·       How is being ‘shadow banned’ (or having content ‘reduced’) from a recommender similar or different to being removed from a hosting platform?

Articles

Compulsory         Jennifer Cobbe and Jatinder Singh, ‘Regulating Recommending: Motivations, Considerations, and Principles’ (2019) 10(3) European Journal of Law and Technology.

This article predates the judgment in YouTube and Cyando, so please read its discussion of recommenders and European intermediary liability law in the context of that case.

Compulsory         Laurens Naudts and others, ‘Toward Constructive Optimisation: A New Perspective on the Regulation of Recommender Systems and the Rights of Users and Society’ in Natali Helberger and others (eds), Digital Fairness for Consumers (European Consumer Organisation (BEUC) 2024).

Required       Ulrik Lyngs and others, ‘So, Tell Me What Users Want, What They Really, Really Want!’ in Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems (CHI EA ’18, New York, NY, USA, ACM 2018).

Required       Christoph Busch, ‘From Algorithmic Transparency to Algorithmic Choice: European Perspectives on Recommender Systems and Platform Regulation’ in Sergio Genovesi, Katharina Kaesling and Scott Robbins (eds), Recommender Systems: Legal and Ethical Issues (Springer 2023).

Required       Arvind Narayanan, ‘Understanding Social Media Recommendation Algorithms’ (Knight First Amendment Institute at Columbia University, 9 March 2023).

Required       Paddy Leerssen, ‘The Soap Box as a Black Box: Regulating Transparency in Social Media Recommender Systems’ (2020) 11(2) European Journal of Law and Technology.

Required       Tarleton Gillespie, ‘Do Not Recommend? Reduction as a Form of Content Moderation’ (2022) 8 Social Media + Society 20563051221117550.

Required       Axel Bruns, ‘Filter Bubble’ (2019) 8 Internet Policy Review.

Required       Daphne Keller, ‘Amplification and Its Discontents: Why Regulating the Reach of Online Content Is Hard’ (2021) 1 Journal of Free Speech Law 227.

Further                  Nick Seaver, ‘Captivating Algorithms: Recommender Systems as Traps’ (2019) 24 Journal of Material Culture 421.

Further                  Natali Helberger and others, ‘Exposure Diversity as a Design Principle for Recommender Systems’ (2018) 21 Information, Communication & Society 191.

Cases

European Union

Required       Case C-682/18 YouTube and Cyando ECLI:EU:C:2021:503 paras 18-39, 103-118.

You should have read this in the first intermediary liability session, so consult your notes from then.

Statute

European Union

Compulsory         Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1, arts 27, 38.

Further                  Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services OJ L 186/57 (‘P2B Regulation’), art 5.

United States

These are provided not because it is crucial to understand US law in this class, but instead because they present policy options for regulating recommenders that at times contrast with those proposed in e.g. the EU.

Further                  (Draft Bill) Protecting Americans From Dangerous Algorithms Act, H.R. 8636, 116th Cong. (2020)

Further                  (Draft Bill) Filter Bubble Transparency Act, H.R. 5921 117th Cong. (2021)

 


 

L6: Data Protection: Form, Function and Reach

Professor Orla Lynskey

In the first part of this session, we introduce a parallel but interwoven regime to privacy and private life: data protection. Lynskey introduces where data protection came from, what it does, and how it relates to privacy. Her book was written before the passage of the General Data Protection Regulation, which builds on previous data protection statutes; the Regulation is the topic of Hoofnagle and others, who seek to summarise it. You should also read the GDPR alongside this article as appropriate.

Think while reading about what data protection seeks to do and protect. How does it secure the aims we discussed when considering privacy? What other ends does it pursue, or ways might it protect or empower people? Is it a subset of privacy, or a separate, complementary regime? How many of the rights and obligations were you aware of, and how does the text of data protection law relate to the practices of firms and governments that you are aware of?

We then turn to the reach of the law: when does it apply? One of the main concepts in data protection is that of personal data. The boundaries of this concept have been hotly contested, and are an important point of interaction for law, policy, computer science and data science alike; all of these disciplines work on this issue intensely. McAuley describes why computer scientists find anonymisation difficult to achieve. Purtova argues that the CJEU has interpreted the GDPR in an expansive manner which has led an unmanageable array of information types to be classifiable as personal data (but compare the optional reading, Dalla Corte, who critiques this reading). Elliot and others propose a different approach, which looks at the risk of data being re-identified in its environment. What would be the benefits or risks of adopting this approach? You should also read the Breyer case, which is discussed at length in the Purtova article.
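The re-identification problem McAuley describes can be illustrated with a toy linkage attack: two datasets that each look ‘anonymised’ on their own can be joined on shared quasi-identifiers. All names and values below are invented for illustration.

```python
# Two "anonymised" extracts: direct identifiers removed, quasi-identifiers kept.
# All records here are invented for illustration.
hospital = [
    {"zip": "02139", "dob": "1962-07-31", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1985-01-02", "diagnosis": "asthma"},
]
voter_roll = [
    {"name": "J. Smith", "zip": "02139", "dob": "1962-07-31"},
]

# Joining the two tables on shared quasi-identifiers re-identifies a record:
# the 'anonymous' diagnosis is now attached to a name.
reidentified = [
    (v["name"], h["diagnosis"])
    for v in voter_roll
    for h in hospital
    if (v["zip"], v["dob"]) == (h["zip"], h["dob"])
]
# reidentified == [("J. Smith", "hypertension")]
```

The point for the legal debate is that whether the hospital extract is ‘personal data’ depends not only on the extract itself, but on what other data exists in its environment.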

 

Articles

Compulsory         Orla Lynskey, ‘The Key Characteristics of the EU Data Protection Regime’ and ‘The Link between Data Protection and Privacy in the EU Legal Order’ in The Foundations of EU Data Protection Law (Oxford University Press 2015) UCL link

Compulsory         Chris Jay Hoofnagle and others, ‘The European Union General Data Protection Regulation: What It Is and What It Means’ (2019) 28 Information & Communications Technology Law 65 OA link

Compulsory         Nadezhda Purtova, ‘The Law of Everything. Broad Concept of Personal Data and Future of EU Data Protection Law’ (2018) 10 Law, Innovation and Technology 40.

Required       Gloria González Fuster, ‘The Materialisation of Data Protection in International Instruments’ in The Emergence of Personal Data Protection as a Fundamental Right of the EU (Springer 2014) UCL link

Required       Lorenzo Dalla Corte, ‘Scoping Personal Data: Towards a Nuanced Interpretation of the Material Scope of EU Data Protection Law’ (2019) 10 European Journal of Law and Technology.

         Critiques Purtova (2018), arguing that the scope she indicates is too broad and highlighting rhetorical flaws in the argument.

Further                  Jef Ausloos, ‘Foundations of Data Protection Law’ in The Right to Erasure in EU Data Protection Law: From Individual Rights to Effective Protection (Oxford University Press 2020) UCL link

Further                  Orla Lynskey, The Foundations of EU Data Protection Law (Oxford University Press 2015) UCL link

Further                  Mark Elliot and others, ‘Functional Anonymisation: Personal Data and the Data Environment’ (2018) 34 Computer Law & Security Review 204. OA preprint link / UCL typeset link

Further                  Michèle Finck and Frank Pallas, ‘They Who Must Not Be Identified—Distinguishing Personal from Non-Personal Data under the GDPR’ (2020) 10 International Data Privacy Law 11.

Contains useful context about pseudonymisation methods such as hashing and salted hashing, as well as detailed legal analysis up to 2020.
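The hashing point in this reading can be sketched in a few lines: a plain hash of an identifier is deterministic, so the same person produces the same digest in every dataset and records remain linkable; a secret salt breaks that linkability for anyone who does not hold the salt. This is a minimal illustration only, not a recommendation of any particular scheme.

```python
import hashlib
import secrets

def plain_hash(email: str) -> str:
    # Deterministic: the same email always yields the same digest,
    # so two datasets hashed this way can still be linked record-by-record.
    return hashlib.sha256(email.encode()).hexdigest()

def salted_hash(email: str, salt: bytes) -> str:
    # A secret salt makes digests unlinkable for anyone without the salt;
    # if the salt leaks, the data reverts to ordinary pseudonymisation.
    return hashlib.sha256(salt + email.encode()).hexdigest()

email = "alice@example.com"       # invented identifier
salt_a = secrets.token_bytes(16)  # each dataset holder uses its own salt
salt_b = secrets.token_bytes(16)

# Plain hashing is linkable across datasets...
assert plain_hash(email) == plain_hash(email)
# ...salted hashing with different salts is not.
assert salted_hash(email, salt_a) != salted_hash(email, salt_b)
```

In GDPR terms, both outputs are arguably still pseudonymised personal data in the hands of someone who can reverse or replicate the process; the legal analysis in the reading turns on who holds the means of re-identification.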

Further                  Benjamin Wong, ‘Delimiting the Concept of Personal Data after the GDPR’ (2019) 39 Legal Studies 517. UCL link

Further                  Michael Veale, Reuben Binns and Lilian Edwards, ‘Algorithms that Remember: Model Inversion Attacks and Data Protection Law’ (2018) 376 Phil Trans R Soc A 20180083.

A paper arguing that machine learned models may contain personal data. Relevant for discussions of chatGPT and similar large language models.

Cases

European Union

Compulsory         Case C-582/14 Patrick Breyer v Bundesrepublik Deutschland ECLI:EU:C:2016:779

On the identifiability of personal data

Required       Case C434/16 Nowak ECLI:EU:C:2017:994

On exam scripts and comments, and whether opinions are or are not personal data.

Required       C-184/20 Vyriausioji tarnybinės etikos komisija ECLI:EU:C:2022:601

On conditions for inference of special category data.

Required       Case C-101/01 Lindqvist EU:C:2003:596

On the scope of the household exemption.

Required       Case C212/13 Ryneš ECLI:EU:C:2014:2428.

On a CCTV camera on a house and the household exemption.

Further                  Case C-479/22 P OC v European Commission ECLI:EU:C:2024:215

Further                  Case C-604/22 IAB Europe ECLI:EU:C:2024:214

On the information that a cookie banner sends about the consent options selected.

Further                  Case C-659/22 Ministerstvo zdravotnictví (Application mobile Covid-19) ECLI:EU:C:2023:745

On the scanning of a vaccination certificate being personal data processing.

Further                  Case C446/21 Schrems (Communication de données au grand public) ECLI:EU:C:2024:834

 


Videos

Compulsory         Video: Derek McAuley, The Anonymisation Problem (Computerphile 2017) length: 10 mins

 

Statute

European Union

Compulsory         GDPR, recitals 26-30, arts 2, 4(1).

Try to understand the household exemption, and the scope of personal data and processing.

Compulsory         Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L 119/1. Read this alongside the Hoofnagle and others article. You do not need to read all of the recitals yet, which removes most of the length; but they do help shine light on the main articles, and are important interpretative tools, so refer back as appropriate.

Compulsory         Charter of Fundamental Rights of the European Union, articles 7–8.

 

💡🇬🇧 Brexit Note 🇪🇺: The GDPR applies in the UK as the so-called ‘UK GDPR’. This is effectively the EU regulation, as of the end of 2020, as retained through the European Union (Withdrawal) Act 2018, and as amended by several instruments since. Currently, these amendments are light, but there may be heavier ones forthcoming. The Sunak Government proposed but did not pass the Data Protection and Digital Information Bill, and the Starmer Government may propose a ‘Smart Data Bill’ that alters the UK GDPR. The core parts of the regime are the same, and EU case law from before the end of 2020 still applies in the UK. I will be explicit in cases of divergence. It is fine to refer to EU cases from after 2020, but be careful not to state that they bind the UK. Some people (including people who should really know better) make the error of saying that the equivalent of the GDPR in the UK is the Data Protection Act 2018 — this is false. The Data Protection Act 2018 sits alongside the UK GDPR: it tailors its institutions, adds national exemptions and restrictions, and transposes some other instruments, such as the Law Enforcement Directive (data protection for policing) and Convention 108+ as applied to the intelligence services, albeit weakly and with many carve-outs.

L7: Data Protection’s Substance: Cutting to the Core

Professor Orla Lynskey

 

The broad scope of data protection law is tempered to some extent by the law’s legitimising force. Once within the scope of the law, data processing is legitimate if it, first, has a legal basis and, second, complies with the principles of personal data processing (sometimes known as Fair Information Practice Principles or FIPPs). The most famous of the legal bases is consent (as interpreted by the CJEU in Planet49). However, it is a common misconception that consent is required for personal data processing, as data controllers can also rely on other legal bases to justify data processing, including legitimate interests and the public interest. Moreover, even where data processing has a legal basis, it must comply with all the data processing principles, which include data security, fairness and proportionality-based principles. In practice, these principles can have a significant disciplining effect on data processing. In this seminar, we will debate whether data protection law gets the balance right between protecting fundamental rights and legitimising data processing.

Articles

Compulsory             Jef Ausloos, ‘Foundations of Data Protection Law’ in The Right to Erasure in EU Data Protection Law: From Individual Rights to Effective Protection (Oxford University Press 2020) UCL link

An effective overview of the origins and essential features of the EU data protection framework.

Further                   Karen Yeung and Lee A. Bygrave, ‘Demystifying the modernized European data protection regime: Cross-disciplinary insights from legal and regulatory governance scholarship’ (2022) 16 Regulation & Governance 137.

Analyses the changes brought about by the GDPR from the perspective of regulation.

Further                  Jef Ausloos and Damian Clifford ‘Data Protection and the Role of Fairness’ (2018) 37 Yearbook of European Law 130-187.

Proposes a distinction between procedural and substantive understandings of fairness in data protection law.

Further                  Reuben Binns, ‘Data protection impact assessments: a meta-regulatory approach’  (2017) 7 International Data Privacy Law 22.

Further                  Orla Lynskey, ‘The Key Characteristics of the EU Data Protection Regime’ and ‘The Link between Data Protection and Privacy in the EU Legal Order’ in The Foundations of EU Data Protection Law (Oxford University Press 2015) UCL link

Further                  Gloria González Fuster, ‘The Materialisation of Data Protection in International Instruments’ in The Emergence of Personal Data Protection as a Fundamental Right of the EU (Springer 2014) UCL link

Cases

European Union

 

Compulsory  Case C-252/21, Meta Platforms and Others ECLI:EU:C:2023:537

On the appropriate legal basis for online profiling to subsidise the provision of ‘free’ online services. Important for consent, legitimate interests and contract.

Compulsory         Case C-673/17, Planet49 ECLI:EU:C:2019:801

Interprets the meaning of consent under the GDPR

Further            Case C-13/16, Rigas Satiksme EU:C:2017:336

On the meaning and application of legitimate interests balancing test.

 

Statute

Compulsory         GDPR, arts 5-9

 

Policy Documents

Required       European Data Protection Board (EDPB) Opinion 08/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms, OA link

Opinion of the EDPB on ‘pay or okay’ and the application of the Meta Platforms judgment in practice

Further           European Court of Human Rights, Guide to the Case Law of the European Court of Human Rights: Data Protection (Council of Europe, updated regularly) OA link

 

 

T3: Data Rights and Wrongs

Preparation (in class, in part)

For this tutorial, there is a short preparatory activity we will use as a basis for discussion about the right of access. Carrying out steps 1-2 are essential, but step 3 is optional (writing further to a company to request your data). I advise doing it, however, as it is free, and interesting to see how they interact with you, and what you get back.

1.     Try to use a “download my data” tool on one or more services that you use. Some examples from the largest services are below:

·       Facebook

·       Twitter

·       Instagram

·       Amazon

·       Apple

·       Google

·       Microsoft

·       TikTok

2.     If you get a copy of the data, try to locate the privacy policy for that company. Privacy policies are usually implemented so as to provide some of the information required in Article 13 of the GDPR (have a look). Re-read articles 13–15, and article 20, of the GDPR. Is this all the data you requested? What personal data might be missing? Do you also think (e.g. from your usage of a service) that the controller might have additional data about you that they are not telling you about?

3.     [Optional, but recommended] Write an email or other message to the firm (how to do so should be detailed in the privacy policy) asking for a full copy of your data, highlighting any omissions you discovered or suspected based on step 2. I have made a template for a very full version of how to do this here, but you are welcome to just type something shorter or more focussed.

If you complete steps 2 and 3, save all the files you get from this process in one place (e.g. a copy of relevant parts of the privacy policy, and any data), so we can talk about the process during the tutorial. Regardless, save and look at any data you receive.

In this session, we’ll discuss the access requests made in preparation. Please come having done the reading, and try to bring whatever you have received on your computer in a form you can refer to during discussion (although there is no need to share it!).

Questions to consider:

·       What is the right of access for? What is its scope? What are its limits?

·       How powerful is the right of access at achieving its various purposes?

·       What barriers exist to make access rights useful or powerful?

·       What barriers did you face getting access to, or scrutinising, data? How might you reform the right of access to make it more useful - or is such reform futile?

Readings

Compulsory         GDPR, arts 12, 15, 20.

Compulsory         European Data Protection Board, ‘Guidelines 01/2022 on data subject rights - Right of access’ (EDPB 2023).

Required       Jef Ausloos and Michael Veale, ‘Researching with Data Rights’ (2020) 2020 Technology and Regulation 136.

Further                  Pierre Dewitte and Jef Ausloos, ‘Chronicling GDPR Transparency Rights in Practice: The Good, the Bad and the Challenges Ahead’ (2024) 14(12) International Data Privacy Law 106.

Further                  Jef Ausloos and Pierre Dewitte, ‘Shattering One-Way Mirrors – Data Subject Access Rights in Practice’ (2018) 8 International Data Privacy Law 4.

Further                  René Mahieu, ‘The Right of Access to Personal Data: A Genealogy’ (2021) 2021 TechReg 62.

Explains a lot about the historical purpose and political vision of the right of access from one of those important in shaping it, Italian scholar and jurist Stefano Rodotà.

Further                  Case C434/16 Nowak ECLI:EU:C:2017:994

On the broad scope of the right of access.

Further                  Case C154/21 RW v Österreichische Post AG ECLI:EU:C:2023:3

On the right to know ‘recipients’ of personal data, and that requests have to be tailored to the individual if required, not generic. See also in English law Harrison v Cameron & Anor [2024] EWHC 1377 (KB).

Further                  Joined Cases C141/12 and C372/12 YS and Others ECLI:EU:C:2014:2081.

On the distinction between data and documents.

Further                  Case C487/21 F.F. v Österreichische Datenschutzbehörde and CRIF ECLI:EU:C:2023:369

Clarifying YS and Others: individuals have a right to database extracts or documents if the provision of this copy is necessary to exercise rights.

Further                  Dawson-Damer & Ors v Taylor Wessing LLP [2017] EWCA Civ 74 at [104]–[108]; B v The General Medical Council [2019] EWCA Civ 1497 at [79]

On ‘purpose-blind’ access rights - ‘the general position is that the rights of subject access to personal data […] are not dependent on appropriate motivation on the part of the requester’

Further                  Case-307/22 FT (Copies du dossier médical) ECLI:EU:C:2023:811

Confirms the purpose-blind nature of access under post-Brexit EU case law - ‘the controller is under an obligation to provide the data subject, free of charge, with a first copy of his or her personal data [..] even where the reason for that request is not related to those referred to in the first sentence of recital 63 of that regulation’.

L8: Being Forgotten Online

Professor Orla Lynskey

 

Learning Objectives

·       How does the right to be forgotten function? Where did it come from?

·       What kind of information and process is needed to assess a RTBF request according to the existing jurisprudence? What kind of a process might this be, and how does that compare to what we know of the processes that are used in practice?

·       Who does the RTBF benefit?

·       Is the balance correct in the RTBF? Should there be more or less emphasis on freedom of expression?

·       Is the RTBF adequately governed? How do private decisions and judicial decisions interplay, and are there potential reforms that you might want to see implemented at this interface?

Articles

Compulsory         Andrés Guadamuz, ‘Developing a Right to Be Forgotten’ in Tatiana-Eleni Synodinou and others (eds), EU Internet Law: Regulation and Enforcement (Springer 2017). OA-ish link UCL paywall link

An overview of some of the debates and sides people take concerning the Right to Be Forgotten.

Compulsory         Paul De Hert and Vagelis Papakonstantinou, ‘Right to Be Forgotten’, Elgar Encyclopedia of Law and Data Science (2022).

A concise encyclopaedia entry providing a guide to the RTBF’s origin and jurisprudence up through the start of 2022.

Further                  Aleksandra Kuczerawy and Jef Ausloos, ‘From Notice-and-Takedown to Notice-and-Delist: Implementing Google Spain’ (2015–16) 14 Colo Tech LJ 219. OA link

         A discussion of some of the roles (as they were and are emerging) of different governance actors in the run up to, and in the wake of, the judgment in Google Spain.

Further                  Theo Bertram and others, ‘Five Years of the Right to Be Forgotten’ in (ACM 2019) Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security 959. OA link

         A scholarly article on how the RTBF has panned out from the point of view of Google employees, who authored it. Read alongside the firm’s regularly updated, quantitative Transparency Report into EU delisting on the basis of the right.

Further                  Joris Van Hoboken, ‘Search Engine Freedom’ in Search Engine Freedom: On the Implications of the Right to Freedom of Expression for the Legal Governance of Web Search Engines (University of Amsterdam 2012) pp 168-213. OA link

A discussion of the theoretical and legal manners in which freedom of expression applies to search engines, given their crucial role in enabling access to information. Pre-dates Google Spain, although not the debates about the RTBF.

Further                  Jef Ausloos, The Right to Erasure in EU Data Protection Law: From Individual Rights to Effective Protection (1st edn, Oxford University Press 2020). UCL link 🔒

Further                  Jean-François Blanchette and Deborah G Johnson, ‘Data Retention and the Panoptic Society: The Social Benefits of Forgetfulness’ (2002) 18 The Information Society 33. OA link paywalled, typeset link

A less legal view of why we might want a right to be forgotten from the standpoint of privacy.

Further                  Tarleton Gillespie, ‘To Remove or to Filter?’ in Custodians of the Internet (Yale University Press 2018). UCL link (paywalled)

A broader discussion of the different tactics that internet intermediaries, and particularly platforms, use to limit the distribution of content online.

Further                  David Erdos, ‘The “Right to Be Forgotten” beyond the EU: An Analysis of Wider G20 Regulatory Action and Potential Next Steps’ (2021) 13 Journal of Media Law 1. OA-ish link UCL link

Examines how similar rights to be forgotten work in jurisdictions including Canada, Turkey and Australia.

Further                  Stefan Kulk and Frederik Zuiderveen Borgesius, ‘Privacy, Freedom of Expression, and the Right to Be Forgotten in Europe’ in Evan Selinger and others (eds), The Cambridge Handbook of Consumer Privacy (Cambridge University Press 2018). OA-ish link UCL paywall link

A useful introduction to how freedom of expression is balanced by the CJEU and ECtHR, focussing on the right to be forgotten. Overlaps otherwise in terms of content with Guadamuz (2017).

Policy Documents

Further                  European Data Protection Board, Guidelines 5/2019 on the criteria of the Right to be Forgotten in the search engines cases under the GDPR (EDPB 2020) OA link

Guidance from the European regulators in charge of enforcing Google Spain and the Right to Erasure. See also the commentary on these guidelines by David Erdos, University of Cambridge.

Further                  Access Now, Understanding the Right to Be Forgotten Globally (Access Now 2017) OA link

A short policy paper from an NGO indicating the surrounding conditions and a safeguard wishlist for global implementation of a right to be delisted on privacy grounds.

Statute

European Union

Compulsory         GDPR, art 17.

United Kingdom

Further                  Data Protection Act 2018, sch 2 part 1

This contains exemptions to the right to erasure (among other data rights) that were set down on the basis of GDPR, art 23 (Restrictions).

Case Law

European Union

Compulsory         Case C-131/12 Google Spain SL and Google Inc v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González ECLI:EU:C:2014:317.

Compulsory         Case C136/17 GC and Others v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:773.

Compulsory         Case C507/17 Google LLC v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:772.

Further                  Case C‑460/20 Google (Déréférencement d’un contenu prétendument inexact) ECLI:EU:C:2022:962.

 

💡 It may help to read a few case notes on GC and Others and Google v CNIL if you are struggling with them. You can search these out on Westlaw, Google Scholar or just the plain old search engine of your choice.

ECtHR

Further                  Hurbain v Belgium ECLI:CE:ECHR:2023:0704JUD005729216

This case was an Article 10 ECHR claim by the Belgian newspaper Le Soir, which had been ordered to anonymise a digital archived version of a 1994 article about a driving offence. No breach of article 10 was found.

United Kingdom

Required       NT1 & NT2 v Google LLC [2018] EWHC 799 (QB). BAILII

The first Google Spain case applied by English courts. One applicant had his delisting refusal by Google overturned by the High Court, and one had it confirmed. What was the difference between them?

L9: Bidding for Attention: Online Tracking and Data Protection Law

Dr Michael Veale

 

🎼 Somebody is prying through your files, probably
Somebody's hand is in your tin of Netscape magic cookies
But relax: if you're an interesting person
Morally good in your acts
You have nothing to fear from facts

The Age of Information (Momus, 1997)

 

Cookie pop-ups when you browse the web are hardly anything new to most Internet users, particularly in Europe. But what exactly are cookies? Why are they something that we should be concerned about? How does the law understand cookies, and how has that developed over time? In this session, we’re going to try to get some answers to these questions. While reading and examining the below, it will be good to think about the following questions:

1.     What conceptual roles do cookies play in online tracking and profiling?

2.     Consent is a large part of the way that we govern cookies in Europe. But is it a good way to do this? What would the alternatives look like, and in what ways would they be better or worse?

3.     If many of the ways cookies are used on the Internet are currently illegal, why hasn’t anything been done about it? What are the main challenges and regulatory hurdles that prevent effective enforcement?

4.     What harms does excessive tracking cause? If you’re okay with getting targeted products, is the extent of tracking present online today okay to keep?

Also pay attention to the issue of controllership. Who is the controller when it comes to tracking on a website?
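As background for the questions above, the cookie mechanism itself is simple: a server sends a Set-Cookie header containing an opaque identifier; the browser stores it and returns it with every subsequent request to that domain, letting the server link those requests into one profile. A minimal sketch using Python’s standard library (the domain name is invented):

```python
from http.cookies import SimpleCookie
import uuid

# Server side: mint a random identifier and attach it via a Set-Cookie header.
cookie = SimpleCookie()
cookie["uid"] = uuid.uuid4().hex             # opaque identifier, not the data itself
cookie["uid"]["domain"] = "tracker.example"  # hypothetical third-party domain
cookie["uid"]["path"] = "/"

header = cookie.output()
# header looks like: Set-Cookie: uid=<random hex>; Domain=tracker.example; Path=/
```

When the same third-party domain is embedded across many websites (via advertising or analytics scripts), the browser returns the same identifier from each of them: this is the basis of cross-site tracking discussed in the readings.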

Articles

Compulsory         Michael Veale and Frederik Zuiderveen Borgesius, ‘Adtech and Real-Time Bidding under European Data Protection Law’ (2022) 23(3) German Law Journal 226.

Argues that real-time bidding is difficult if not impossible to reconcile with data protection law.

Compulsory         Catherine Armitage and others, Study on the Impact of Recent Developments in Digital Advertising on Privacy, Publishers and Advertisers: Final Report (Publications Office of the European Union 2023) pages 66-103 only.

Required       Michael Veale, Midas Nouwens and Cristiana Santos, ‘Impossible Asks: Can the Transparency and Consent Framework Ever Authorise Real-Time Bidding After the Belgian DPA Decision?’ (2022) 2022 Technology and Regulation 12.

Further                  Midas Nouwens and others, ‘Dark Patterns after the GDPR: Scraping Consent Pop-Ups and Demonstrating Their Influence’ in (ACM 2020) Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 2020).

Further                  René Mahieu and Joris Van Hoboken, ‘Fashion-ID: Introducing a Phase-Oriented Approach to Data Protection?’ (European Law Blog, 30 September 2019) https://europeanlawblog.eu/2019/09/30/fashion-id-introducing-a-phase-oriented-approach-to-data-protection/

Policy Documents

Required       European Data Protection Board, ‘Opinion 08/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms’ (EDPB 2024).

Further                  Information Commissioner’s Office, ‘Update Report into Adtech and Real Time Bidding’ (20 June 2019).

Further                  Information Commissioner’s Office, ‘Data Protection and Privacy Expectations for Online Advertising Proposals’ (25 November 2021). link

Cases

European Union

Compulsory         Case C-210/16 Wirtschaftsakademie Schleswig-Holstein ECLI:EU:C:2018:388.

On whether the owner of a Facebook fanpage is a joint controller with Facebook.

Compulsory         Case C-49/17 Fashion ID ECLI:EU:C:2019:629.

On whether a website operator is a controller in relation to a third-party tracker embedded on its site (a Facebook pixel), and if so, for which phases of the processing.

Compulsory         Case C-604/22 IAB Europe ECLI:EU:C:2024:214.

Required       Case C-252/21 Meta Platforms and Others ECLI:EU:C:2023:537.

Particularly concerning the interaction of web tracking data and Article 9 sensitive data characteristics.

Further                  Case C-25/17 Jehovan todistajat ECLI:EU:C:2018:551.

Consider how the controllership reasoning in this case might be analogised to tracking contexts.

Further                  Case C-673/17 Planet49 GmbH ECLI:EU:C:2019:801.

Establishes that the Data Protection Directive 1995 imposed a similarly high standard for consent to that of the GDPR.

Further                  Case C-131/12 Google Spain ECLI:EU:C:2014:317.

Establishes that a search engine can be a controller of personal data (the search index, when queried by name).

 

💡 For the compulsory cases it can additionally be useful to read the Advocate General opinions if you are unclear about the points and context. Remember, the Court does not always agree with these, so if you cite them in your writing, use them only analytically and in light of the judgment that followed, not as binding law!

Required       Case C-210/16 Wirtschaftsakademie Schleswig-Holstein GmbH ECLI:EU:C:2017:796, Opinion of AG Bot.

Required       Case C-49/17 Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV ECLI:EU:C:2018:1039, Opinion of AG Bobek.

United Kingdom

These cases are not directly about the substance of tracking, but concern a workaround of browser cookie settings and the ability to claim damages on that basis. They are included here mainly for completeness, but are also of interest (both the final judgments listed and the prior decisions appealed) for how courts understand issues of tracking.

Further                  Vidal-Hall v Google Inc [2015] EWCA Civ 311.

Further                  Lloyd v Google LLC [2021] 3 UKSC 50.

Belgium

Further                  (Regulator Decision) Belgian Data Protection Authority, ‘Decision on the Merits 21/2022 of 2 February 2022, Complaint Relating to Transparency & Consent Framework (IAB Europe), DOS-2019-01377’ (2 February 2022).

Summarised and analysed in Veale, Nouwens and Santos (2022) above. The appeal against this decision gave rise to the preliminary reference decided by the CJEU in Case C-604/22 IAB Europe, above.

Statute

United Kingdom

Compulsory         Privacy and Electronic Communications (EC Directive) Regulations 2003, reg 6.

This derives from the e-Privacy Directive, art 5(3), below.

European Union

Required       Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) OJ L 201/37, art 5(3).

This is implemented in the UK by PECR reg 6, above.

T4: Let’s Do It At My Place Instead? Challenges of Encrypted Computation In Advertising and Beyond

 

In response to accusations of widespread illegality in advertising technologies, some firms have considered a move to ‘on-device’ targeting. This typically involves the browser or operating system in collecting data, analysing it, and delivering adverts. Thanks to advances in privacy-enhancing technologies and computation, some of these approaches might genuinely improve the confidentiality of data, mitigating the accusation levelled at approaches such as real-time bidding that data flows, unsecured, to an unlimited number of parties. However, we should pause before concluding that this solves all the problems of advertising. While targeting individuals in detail without spreading their data across large numbers of actors might resolve some issues, those stemming from the targeting itself, and related claims around manipulation and autonomy, may remain. Furthermore, under data protection law, on-device processing raises challenging questions of controllership. Who is the data controller of processing that happens on a user’s device, even where that user has limited control over it? How can data rights or principles be operationalised? Do browsers and operating systems — previously seen as more passive actors without clear obligations in data protection law — have a greater role to play in our future?

 

Readings

Compulsory         Lee McGuigan, Sarah Myers West and Ido Sivan-Sevilla, ‘The After Party: Cynical Resignation in Adtech’s Pivot to Privacy’ (2023) 10 Big Data & Society.

Compulsory         Michael Veale, ‘Rights for Those Who Unwillingly, Unknowingly and Unidentifiably Compute!’ in Hans-Wolfgang Micklitz and Giuseppe Vettori (eds), The Future of the Person (Hart 2025).

Further                  David Eliot and David Murakami Wood, ‘Culling the FLoC: Market Forces, Regulatory Regimes and Google’s (Mis)Steps on the Path Away from Targeted Advertising’ (2022) 27 Information Polity 259.

Further                  Michael Veale, ‘Future of online advertising: Adtech’s new clothes might redefine privacy more than they reform profiling’ (netzpolitik.org, 25 February 2022)

Further                  Lee McGuigan and others, ‘Private Attributes: The Meanings and Mechanisms of “Privacy-Preserving” Adtech’ [2023] New Media & Society 14614448231213267.

Questions

Consider the readings from the seminar, the lecture, and at least the compulsory reading above.

·       What are the main privacy and other policy challenges from ‘classic’ targeted advertising? What kind of forms does online advertising take, and how does this link to harms and policy challenges?

·       Many proposed advertising techniques move targeting into the browser or operating system, rather than carrying it out on remote servers. Is this good or bad for privacy? For data protection?

·       Why might large advertising platforms want to ensure they target on-device, and/or in a way where they cannot see the personal data?

·       Is there a data controller of this on-device advertising? Who? Can they fulfil the full rights and obligations of a data controller?

·       How should a data regulator, or internet law, react to the move to on-device processing?


 

L10: Artificial Intelligence and Data Protection

Professor Orla Lynskey and Dr Michael Veale

 

Questions for reading

-       Under what conditions would an AI model qualify as personal data under the GDPR? What problems does this create?

-       Is it legal to scrape personal data from the Web under data protection law? What are the main aspects that make authorisation difficult, and what are the main tensions this creates in relation to the training of large language models?

-       Do you think data protection law as it stands is the suitable legal mechanism for artificial intelligence? If not, why not?

Policy Documents

Compulsory         European Data Protection Board, ‘Opinion 28/2024 on Certain Data Protection Aspects Related to the Processing of Personal Data in the Context of AI Models’ (EDPB, 17 December 2024).

This Opinion results from difficult questions posed to the EDPB by the Irish DPA under Art 64(2) GDPR, a mechanism which puts the EDPB on the clock to issue guidance on the issues raised.

Further                  Information Commissioner’s Office, ‘Information Commissioner’s Office response to the consultation series on generative AI’ (ICO, 10 December 2024).

This should be read alongside the five calls for consultation it collectively responds to, covering lawful basis, purpose limitation, accuracy, individual rights, and the allocation of controllership. Each contains further analysis (with colourful diagrams) and has yet to be consolidated.

Further                  The Hamburg Commissioner for Data Protection and Freedom of Information, ‘Discussion Paper: Large Language Models and Personal Data’ (Datenschutz Hamburg, 15 July 2024).

This paper stirred considerable discussion in mid-2024 for coming down strongly against classifying large language models as personal data under any circumstances. The ICO and EDPB papers above disagree with it (and the EDPB Opinion, adopted under Art 64(2) GDPR, will likely be very influential in future cross-border opinions and cases, and before the CJEU, although it does not yet finally bind Member State authorities).

Articles and Chapters

Compulsory         Reuben Binns and Lilian Edwards, ‘Reputation Management in the ChatGPT Era’, Oxford Handbook on the Foundations and Regulation of Generative AI (Oxford University Press 2025) section 3.2.

Required       Michal Gal and Orla Lynskey, ‘Synthetic Data: Legal Implications of the Data-Generation Revolution’ (2023) 109 Iowa Law Review 1087.

Further                  Michael Veale, Reuben Binns and Lilian Edwards, ‘Algorithms That Remember: Model Inversion Attacks and Data Protection Law’ (2018) 376 Phil. Trans. R. Soc. A 20180083.

News Reporting

Further                  Jason Koebler, ‘Not Just “David Mayer”: ChatGPT Breaks When Asked About Two Law Professors’ (404 Media, 2 December 2024).

Cases

European Union

Required       Case C-136/17 GC and Others v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:773.

In this case, the CJEU interprets what appears to be an unambiguous, strict obligation in the GDPR to be a little less than that, given the ‘responsibilities, powers and capabilities’ of the controller. Will the same thing happen with generative AI?