LAWS0339 Internet Law and Policy

Convenor: Professor Michael Veale, co-taught with Professor Orla Lynskey and Dr Tommaso Fia.

Faculty of Laws, University College London.

Academic Year 2025–6. Document Version December 2025.

 

About the Module

Seminar 1 Welcome to the Internet

Seminar 2 How To Control the Internet

Seminar 3 Blocking and Networked Rule-Making: the Domain Name System

Seminar 4 Liability Shielding and Safe Harbours

Seminar 5 Content Moderation Obligations

Seminar 6 Fair Moderation and Takedown Abuse

Seminar 7 Specific and General Monitoring

Seminar 8 By-Design Platform Regulation

Seminar 9 Verification: Ages, Names, Identities

Seminar 10 Moderating AI: Models and Outputs

Seminar 11 Privacy Online

Seminar 12 Data Protection’s Scope

Seminar 13 Data Protection’s Substance

Seminar 14 Data Rights and Wrongs

Seminar 15 Online Tracking: Cookies and Controllers

Seminar 16 International Transfers of Personal Data

Seminar 17 Emerging Models for Data Governance

Seminar 18 Automated Decision-Making

Seminar 19 Language Models & Personal Data

Seminar 20 Can Law Change The Internet?


 

About the Module

Office Hours

Professor Veale’s office hours are linked on Moodle and should be your first port of call; you can attend them for questions on any topic. For topics taught by Professor Lynskey or Dr Fia, you can also attend their office hours; links are on Moodle. Attendance at office hours is strongly encouraged, regardless of the kind of question you have.

Attendance

Attendance at all seminars is compulsory. There are no tutorials. Attendance will be taken. Students with insufficient attendance will, as described in the LLB handbook, not be permitted to take the exam. We do not guarantee that sessions will be recorded; in any case, much or all of each session will be discussion-based, which will not be recorded and will be the site of important learning and discussion that is examinable.

Reading guidance

There is no comprehensive, up-to-date textbook for Internet Law, nor any up-to-date casebooks. Reading the cases and statutes and making your own notes is incredibly important. Much material is extremely new and little, if any, reliable public commentary exists. If you are able, we strongly recommend bringing printed copies to class and making notes on these.

Reading priority guidance in this syllabus

Read Before Seminar  § You must read this before the seminar. Seminars will be discussion-based and you should have read the readings with the given questions in mind.

Recommended                § You should read this before the exam, or when writing an essay on this topic.

Further          § You can read this if you are interested in the topic, or if you have read the required readings and still feel you lack understanding (although in that case, also come to office hours!)

 

Note that readings in these categories may be spread across multiple sections, such as Readings, Case Law and Statute. Ensure that you check through the whole week.

Readings are mostly linked from this syllabus. If the links don’t work or have changed, then they should be easy to find if you try the UCL Library search, and then a normal search engine or Google Scholar.

Formative Assessment

There will be two mandatory essays for the module, in Term 1 and Term 2. These can be submitted online. You are welcome to practise writing under closed-book, handwritten conditions and upload a scanned version to Moodle, in which case I will mark it as if it were taken in exam conditions; otherwise, I will mark it as an open-book essay.

Summative Assessment

There will be a 3-hour closed book exam in Term 3. You will have a choice from multiple questions.


 

Seminar 1 Welcome to the Internet

Professor Michael Veale

We all use the Internet. But how many people know what it is; where it came from; how it works? This session will introduce the history of the Internet and the functioning of core Internet technologies. Why were they designed as they were; and with whose values at the core? We will start to see why and how these design decisions interact with law and policy concerning the online world; a theme that will recur throughout the module.

Questions to prepare for the seminar

·       What are the core technologies underpinning the Internet and how (in very rough terms) do they work?

·       What is the difference between the Internet and the Web?

·       What technical design principles underpin the Internet? Whose values do they reflect? How do these compare to broader values, for example those reflected in human rights regimes?

·       What are some of the main actors and organisations that govern the Internet, and how do they make decisions?

·       Through what different lenses might we view what the ‘Internet’ is?

·       What is ‘cyberlibertarianism’?

 

Note   As legal scholars we have to understand technologies enough to be able to reason about how law and policy applies to them, not to be able to deploy them ourselves. So don’t worry if you don’t understand everything, or it seems too technical — focus on trying to get a general understanding.

Readings

Read Before Seminar  Andrés Guadamuz, ‘Internet Regulation’ in Lilian Edwards (ed), Law, Policy, and the Internet (Hart Publishing 2019)

A useful and broad-ranging introduction to what it is to study Internet law and policy.

Read Before Seminar  William Lehr and others, ‘Whither the Public Internet?’ (2019) 9 Journal of Information Policy 1 (read pages 1–20).

This article very usefully unpacks three different ways of understanding what “the Internet” is. Parts of it use some difficult terminology: it is not essential that you understand it all.

Further          Corinne Cath-Speth, ‘Internet Histories: Partial Visions of People and Packets’ in Corinne Cath-Speth, Changing Minds and Machines: A Case Study of Human Rights Advocacy in the Internet Engineering Task Force (IETF) (DPhil Thesis, Oxford University, 2021) pages 27-51.

         Traces and critiques the cultural and historical values behind the Internet engineers often seen as the main characters in a predominantly white, male, biographical history of the Internet.

Further          Malte Ziewitz and Ian Brown, ‘A Prehistory of Internet Governance’ in Research Handbook on Governance of the Internet (Edward Elgar Publishing 2013).

         A history of the Internet and its early governance, as well as the institutions involved. Perhaps the best of the many “great men” tales of Internet history critiqued by Cath-Speth (2021).

Further          Kieron O’Hara and Wendy Hall, Four Internets: Data, Geopolitics, and the Governance of Cyberspace (Oxford University Press 2021). paywall link / UCL link

A recent overview of the way in which the Internet as we know it is diverging as different nations and jurisdictions take different views on its development and trajectory.

Seminar 2 How To Control the Internet

Professor Michael Veale

The Internet is sometimes described by politicians as a “Wild West”; a regulation-free land that has long resisted control. According to some, it needs bringing to heel. This is an overly simplistic story. In this session, we will build on the previous session’s understanding of Internet technologies and their design principles to think both about the ways in which this network has resisted control, and the ways in which it has been controlled by both public and private actors.

Questions to prepare for the seminar

·       Does code ‘regulate’? How does this differ from the types of regulation that lawyers are more familiar with discussing?

·       What does it mean to say the Internet is ‘generative’? What are the policy implications of generative technologies?

·       Which actors are well-positioned to regulate the Internet, and why? What factors make it easier or harder for these actors to alter the working of these networks?

Articles

Read Before Seminar  Jonathan L Zittrain, ‘The Generative Internet’ (2006) 119 Harv L Rev 1974. OA link

         This article is in some ways a prediction of how ‘walled gardens’ online might emerge. Was Zittrain right?

Read Before Seminar  Julie E Cohen, ‘“Piracy,” “Security,” and Architectures of Control’ in Configuring the Networked Self: Law, Code, and the Play of Everyday Practice (Yale University Press 2012). OA link

A strong critique of a variety of ways of looking at ‘code’ and ‘law’. This might be worth coming back to after we talk about these topics some more, as it is a very dense but rewarding read.

Further          Lawrence Lessig, Code: Version 2.0 (Basic Books 2006) OA link

I suggest looking at pages 61-137 (chapters 5-7). Lessig is referred to a lot in the other readings, and if writing about him you should have a look at what he says in his own words.

Further          Clément Perarnaud and others, ‘“Splinternets”: Addressing the Renewed Debate on Internet Fragmentation’ (European Parliamentary Research Service, 2022).

Further          Niels ten Oever, ‘“This is Not How We Imagined It”: Technological Affordances, Economic Drivers, and the Internet Architecture Imaginary’ (2021) 23 New Media & Society 344. OA link

This paper is good for those interested in some of the ways in which the technical sides of standards are value-laden and contested. Further readings in this vein can be found in Beatrice Martini, ‘Internet Infrastructure and Human Rights: Reading List’ (Stanford PACS, 2020).

 

Seminar 3 Blocking and Networked Rule-Making: the Domain Name System

Professor Michael Veale

In this session we will both consolidate our understanding of theories of networked governance and examine a tangible example: the DNS system. This example will help you increase your confidence with the theoretical side of Internet law, while teaching you how to grapple with technical documents as lawyers.

The Internet functions on protocols — a common language and grammar that enables them to work effectively and at scale. Changing these protocols can change the way the Internet works, and its possibilities and uses.

The domain name system is a core part of the way that the Internet works. It effectively allows us to browse content without having to remember addresses that were designed only for computers (in particular, IP addresses that look like 43.157.242.47). Instead, we just remember simple domain names — the part of a web address between the ‘://’ and the next ‘/’.

However, DNS has also been described as the “Achilles heel” of the Internet as it has been a serious ground of contestation around issues such as privacy and Internet blocking. This is effectively because if you can control the address book of the Internet, you make resources a lot harder to discover.

Most recently, a series of proposals have been made which significantly change the way that this system works – in particular by encrypting it. These services are typically called DNS over HTTPS — or ‘DoH’, because HTTPS is the encrypted way of delivering services on the Web (the ‘S’ is for Secure). These interact and interfere with a range of existing mechanisms to block content, and so have been controversial. But who is responsible for this change? The standard setters? The Web browsers? The internet service providers? Legislators? We will use DNS-over-HTTPS as an example to answer these questions.
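To make the ‘address book’ framing above concrete, here is a purely illustrative toy model in Python (the domain names, IP addresses and blocklist are invented for illustration; real DNS is a distributed, hierarchical system, not a single dictionary):

```python
# Toy model of DNS as the Internet's "address book": a mapping from
# human-readable domain names to machine-readable IP addresses.
# Purely illustrative — real DNS is distributed and hierarchical.

address_book = {
    "example.org": "93.184.216.34",
    "blocked-site.example": "203.0.113.7",
}

def resolve(domain, blocklist):
    """Return the IP for a domain, or None if the resolver filters it."""
    if domain in blocklist:
        # DNS-based blocking: the name simply fails to resolve,
        # so the resource becomes much harder to discover.
        return None
    return address_book.get(domain)

# A resolver operator (e.g. an ISP) can block by refusing to answer:
isp_blocklist = {"blocked-site.example"}
print(resolve("example.org", isp_blocklist))           # → 93.184.216.34
print(resolve("blocked-site.example", isp_blocklist))  # → None (blocked)
```

Under traditional, unencrypted DNS, the ISP’s resolver can apply a blocklist like this and can observe every query. Under DoH, queries instead travel encrypted to a resolver chosen by (for example) the browser, so this filtering point can be bypassed — which is precisely why the change is contested.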

💡 This topic is technical but we will approach it as law scholars. You are not expected to know the technical details of DNS, but you are expected to understand the broad way it works, and the types of changes that encrypting it will lead to. Try and avoid getting in the weeds and continue to ask the question of ‘so what?’ when you read more technical work, thinking of the legal and policy consequences you might imagine. The ‘so what?’ question should guide your reading — what questions do you really need answers to that are technical, and which can you abstract away and ignore?

Questions to prepare for the seminar

·       What is DNS? Identify the actors in it. What is important to know about DNS in relation to law and policy? How is DNS used in Internet blocking? Who does this blocking?

·       Which actors have the power to decide whether encrypted DNS systems, such as DNS-over-HTTPS (DoH), are in practice introduced? What does this mean for the power of private firms vis-à-vis the state? How does this fit into the discussion in Cohen (2019)?

·       The UK’s Internet blocking regime is heavily built on DNS filtering. Should this be considered when encrypted DNS is introduced? Who should consider it, and how? How might this work internationally?

Readings

Read Before Seminar  Julie E Cohen, ‘Networks, Standards, and Transnational Governance Institutions’, Between Truth and Power: The Legal Constructions of Informational Capitalism (Oxford University Press 2019).

         This book chapter looks at the ways in which law interacts with the governance of networks, and the role of powerful actors in their governance.

Read Before Seminar  Video 📹: Mike Pound, How DNS Works (Computerphile, University of Nottingham, 9 July 2020).

Read Before Seminar  Open Rights Group, ‘DNS Security — Getting it Right’ (24 June 2019)

The Open Rights Group (ORG) is a UK-based digital rights NGO. This report outlines many of the technical aspects of DNS and their consequences, written from the standpoint of a digital rights advocacy organisation.

Read Before Seminar  Alex Hern, ‘Firefox: “no UK Plans” to Make Encrypted Browser Tool Its Default’ The Guardian (24 September 2019)

Recommended                Internet Infrastructure Coalition, ‘DNS at Risk: How Network Blocking and Fragmentation Undermine the Global Internet’ (May 2025)

This report is by an umbrella organisation of internet infrastructure companies. It does not look at the UK, but examines government use of DNS blocking in many other countries around the world. You do not need to read the report in detail; focus on the start and the end, and skim the examples — most take up a full page, so the report is not actually so long.

Further          Debate in Hansard on “Internet Encryption” (HL Deb 14 May 2019, vol 797, cols 1492–1495)

         A good opportunity to see one of the strangest legislative chambers in the world talk about a highly technical issue.

Further          Internet Watch Foundation, ‘Briefing on DNS-over-HTTPS’

         A briefing that in many ways poses a counterpoint to the Open Rights Group paper.

Further          Internet Society, ‘Encrypted DNS Factsheet’ (May 2023)

A complement to the ORG paper above, shorter. Contains diagrams around DNS and a shorter description of some of the different versions of encrypted DNS (the distinctions between which are not necessary to focus on).


 

Seminar 4 Liability Shielding and Safe Harbours

Professor Michael Veale

Who is responsible for what happens on the Internet? The companies with the biggest pockets often sit between the creator of content and the user viewing it. Furthermore, it can be difficult to identify who says what online, particularly where speakers are outside the jurisdiction or use technical means to mask their identity. Yet often, people online — with good or nefarious motives — want content to go away. The obvious actors to go after in this setting are the middlemen — companies like Google or Facebook which in theory have the ability and resources to stop content being distributed. However, this has a fraught history, and if such companies were responsible for everything, it is unlikely that they would want to promote expression particularly openly and freely. Since the late 1990s, law has responded, creating a regime of shielding for many companies that some hail as central to the development of the internet and others lament as the foundation of a culture of online irresponsibility. We’ll look at the origins and underlying legal structures in this session, before moving on in future sessions to look at more complex elements of this landscape and how they have developed over time.

Questions to prepare for the seminar

·       What was the initial logic behind intermediary liability laws? Was that logic legitimate at the time, and does it remain so today?

·       What kind of actors are “hosting” intermediaries? How broad is this term? What kind of internet content services are not included in it?

·       What factors has the CJEU indicated do not compromise the shielding of intermediaries? Do you agree with these judgments?

·       What is the concept of ‘neutrality’ that the CJEU has built? Are modern platforms neutral in this way?

·       What is the purpose of the concept of the ‘diligent economic operator’?

Articles and Chapters

Read Before Seminar  Martin Husovec, ‘Introduction to Liability Framework’, ‘Liability Exemptions: General Requirements’ and ‘Liability Exemptions: Specific Services’, Principles of the Digital Services Act (Oxford University Press 2024).

Recommended                Lilian Edwards, ‘“With Great Power Comes Great Responsibility?”: The Rise of Platform Liability’ in Lilian Edwards (ed), Law, Policy, and the Internet (Hart Publishing 2019). UCL link

         This chapter looks over the history of intermediary liability law. Read if you are finding it difficult to understand the motivation and context of these laws even after reading Husovec (2024). Does not cover the DSA due to publication date but provides strong context up until that point.

Further          Folkert Wilman, ‘Between preservation and clarification: The evolution of the DSA’s liability rules in light of the CJEU’s case law’ in Joris Van Hoboken and others (eds), Putting the DSA into Practice: Enforcement, Access to Justice, and Global Implications (Verfassungsbooks 2023).

A slightly more up to date overview than the Edwards chapter above, with an EU focus.

Further          Aleksandra Kuczerawy, ‘From “Notice and Take Down” to “Notice and Stay Down”: Risks and Safeguards for Freedom of Expression’ in The Oxford Handbook of Intermediary Liability Online (Oxford University Press 2020). UCL link

Further          Graham Smith, ‘5.6 Liability of Online Intermediaries’ in Internet Law and Regulation (Sweet and Maxwell 2020) Book available on Westlaw UK

Part of a practitioner textbook on Internet Law. Goes into heavy detail on the jurisprudence in the area.

Further          Nico van Eijk and others, Hosting Intermediary Services and Illegal Content Online: An Analysis of the Scope of Article 14 ECD in Light of Developments in the Online Service Landscape: Final Report. (European Commission 2019).

Further          Philipp Hacker, Andreas Engel and Marco Mauer, ‘Regulating ChatGPT and Other Large Generative AI Models’, Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency (Association for Computing Machinery 2023).

Do AI systems benefit from intermediary shielding under the Digital Services Act? In section 5, Hacker and colleagues argue they do not.

Further          Jaani Riordan, ‘Defamation’ in Jaani Riordan (ed), The Liability of Internet Intermediaries (Oxford University Press 2016) (UCL 🔒)

This chapter covers UK developments relating to intermediary liability shielding and defamation.

Statute

European Union

Read Before Seminar  (“DSA”) Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1, arts 4, 5, 6, 7, 8.

         This is the new intermediary liability law applicable in the European Union. It does not apply in the UK as it was passed after Exit Day. It contains many other platform provisions which we will look at later in the module — for now just focus on the liability shielding sections.

Further          (Parts repealed by the DSA) Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) OJ L 178/1, recitals 40-49, arts 1, 12–15.

The Digital Services Act (DSA) replaces the relevant parts of the e-Commerce Directive from February 2024. ECD, arts 12–15 map to DSA arts 4, 5, 6, and 8. Article 7 of the DSA is new. When writing on this topic, while the UK still has the ECD and its derived rules (see box below), the EU now has the DSA. Case law typically refers to the ECD and will still be important in interpreting the DSA due to its extremely similar phrasing and substance.

United Kingdom

Further          The Electronic Commerce (EC Directive) Regulations 2002 ss 17–19.

 

🇬🇧 Brexit Note 🇪🇺: The DSA does not apply in the UK, which will still use the transposition of the ECD (The Electronic Commerce (EC Directive) Regulations 2002). These regulations do not explicitly contain art 15 of the ECD, the general monitoring prohibition, as the UK relied on the vertical direct effect of the ECD to apply that provision (i.e. to prevent Parliament adopting contrary laws, and prevent emanations of the state from applying general monitoring obligations), and that effect has lapsed. We do not have an equivalent of DSA art 7, as has been noted (and was consequential) in Montres Breguet SA & Ors v Samsung Electronics Co Ltd & Anor [2023] EWCA Civ 1478 [103] (a judgment which in obiter also seems to be mistaken about the Article 15 omission).

United States

🇺🇸 US law in this area centres on ‘Section 230’. This part of US law has its own mythology and saga-filled history. If interested, you can follow it in Jeff Kosseff, The Twenty-Six Words That Created the Internet (Cornell University Press 2019) (UCL 🔒), and for an alternative, more critical view on this provision, see Mary Anne Franks, ‘The Cult of the Internet’, The Cult of the Constitution (Stanford University Press 2019) (UCL 🔒).

Recommended                47 U.S.C. 230(c) (‘Section 230’).

This is the broad-ranging liability shield in US law which does not have a notice and takedown requirement. Often referred to, technically incorrectly, as ‘Section 230 of the Communications Decency Act’, which amended that section into existence. This is such a common error that it is now more a shorthand, and even appears to be cited that way by the US Supreme Court (see Justice Alito in Moody v NetChoice, 603 U.S. 707 (2024) at 7), so feel free to refer to it that way.

Further          17 U.S.C. § 512 (Digital Millennium Copyright Act, “Limitations on liability relating to material online”)

This section of the DMCA is the part of US law which mirrors the mechanisms in UK and EU law — a notice and takedown regime for intermediaries but only in relation to copyright infringement, compared to a more horizontal regime in the EU, UK and many other jurisdictions.

Case Law

European Union

Read Before Seminar  Case C-682/18 YouTube and Cyando ECLI:EU:C:2021:503 paras 18-39, 103-118.

A case concerning whether certain platform features give an “active role” to intermediaries. Note: English courts typically call this case “Peterson” after the other joined party, so you might see it referred to as that.

Recommended                Joined Cases C-236/08 and C-238/08 Google France ECLI:EU:C:2010:159

Focus on paras 22-32 and 106-120, you don’t need to understand the relevant trademark law.

Recommended                Case C-324/09 L’Oréal SA v eBay ECLI:EU:C:2011:474

Consider what being a ‘diligent economic operator’ means in this judgment.

Recommended                Case C-492/23 Russmedia Digital ECLI:EU:C:2025:935

         This case, handed down in late 2025, may shake up a few things in the interaction between data protection law and intermediary liability. Read with care and consider again after studying data protection law.

Further          Case C-492/23 Russmedia Digital ECLI:EU:C:2025:68 (Opinion of AG Szpunar)

This case looks at the interplay between the GDPR and the e-Commerce Directive. The Advocate General argues that neutral hosts should typically be treated as processors. This may be difficult to follow until after we have studied data protection law. Note that the court did not follow this advice — but contrasting the two is enlightening.

 

💡 Reading the Advocate General opinions for these cases can elaborate on how they link to previous case law, and they are written in a less robotic way than the CJEU (although remember that the Court does not always follow the reasoning of the AG.) You can access these on the CJEU’s own webpage using the Curia search by entering the case number.

European Court of Human Rights

🇪🇺 The case-law of the ECtHR on intermediary liability has at times seemed at odds with the CJEU. The case of Delfi v Estonia received criticism for seeming to ignore EU law. While the ECtHR seemed to draw some boundaries around the Delfi case, these seem to have been radically loosened in the more recent 2023 Grand Chamber judgment in Sanchez v France, which did not find a violation of freedom of expression in a case where the French state held a politician criminally liable for not removing hateful comments under his post on a friends-only Facebook page, despite his not being explicitly notified of these posts by those taking the action.

Recommended                Sanchez v France ECLI:CE:ECHR:2023:0515JUD004558115

See analysis of the potential effect of this judgment in Jacob van de Kerkhof, ‘Sanchez v France: The Expansion of Intermediary Liability in the Context of Online Hate Speech’ (Strasbourg Observers, 17 July 2023).

Further          Delfi AS v Estonia ECLI:CE:ECHR:2015:0616JUD006456909

Can this case be reconciled with EU law? Arguments that it can be, such as Aleksandra Kuczerawy and Pieter-Jan Ombelet, ‘Not so Different after All? Reconciling Delfi vs. Estonia with EU Rules on Intermediary Liability’ (Media@LSE, 1 July 2015), seem to rely on a finding opposite to that of the CJEU in YouTube and Cyando, now confirmed in the recitals of the DSA, that knowledge of a violation needs to be specific, not just general.

Further          MTE v Hungary ECLI:CE:ECHR:2016:0202JUD002294713

Further          Pihl v Sweden ECLI:CE:ECHR:2017:0207DEC007474214

Both MTE v Hungary and Pihl v Sweden are cases where the ECtHR attempts to distinguish from Delfi on the basis of the extremity of the speech in question — something quite different from the DSA’s intermediary liability regime, which provides shielding without considering the nature of the content.

 

💡 The ECtHR maintains a list of decided and pending cases relating to freedom of expression and hate speech online within their Press Unit’s Hate Speech factsheet (under ‘Online Hate Speech’), although it is not always fully updated.

United Kingdom

Recommended                Montres Breguet SA & Ors v Samsung Electronics Co Ltd & Anor [2023] EWCA Civ 1478.

See [93]–[105]. This case concerns watch faces which infringe Swatch trademarks and the Samsung Galaxy Store (for its watches). By operating an app store in the way that it did, such as by checking apps against its ‘content review guide’, Samsung took an ‘active’ role and was thus refused hosting liability shielding.

Further          Payam Tamiz v Google Inc [2013] EWCA Civ 68

Establishes at common law that an intermediary such as Google providing Blogger.com would not be a publisher of comments made by a user of that platform (further shielding that may apply were the defence in the e-Commerce Directive to be removed).

Further          Cartier International AG & Ors v British Telecommunications Plc & Anor [2018] UKSC 28

This case is largely of tangential interest, but to date is the only time the UK Supreme Court has been asked to consider a question relating explicitly to the liability of internet intermediaries.


 

Seminar 5 Content Moderation Obligations

Professor Michael Veale

Platforms that host content uploaded by users face a range of pressures to remove or limit it. Historically these have included market pressures to attract users (recall the early CompuServe case from the Liability Shields session). Today, advertisers might refuse to advertise, payment processors such as Visa, Mastercard or PayPal might refuse to serve, and free-standing legal obligations might kick in under a new generation of platform laws.

Questions to prepare for the seminar

·       What are some of the reasons that platforms remove or limit the spread of certain user-generated content?

·       What is different, and what is similar, between the approaches in the EU Digital Services Act and the UK Online Safety Act, to regulating content moderation? Do you prefer one or the other?

·       What is a ‘hash database’ in content moderation?

·       When do algorithms support content moderation, and when might they fail?
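The ‘hash database’ mentioned in the questions above can be sketched in a few lines of Python. This is a deliberately simplified illustration: it uses an exact-match cryptographic hash and invented example content, whereas real systems (such as Microsoft’s PhotoDNA) use perceptual hashes that tolerate small alterations.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """A short, fixed-length 'fingerprint' of a piece of content."""
    return hashlib.sha256(content).hexdigest()

# A shared "hash database" of content already judged to violate policy.
# Platforms can match uploads against it without storing the content itself.
hash_database = {fingerprint(b"known prohibited image bytes")}

def should_block(upload: bytes) -> bool:
    """Check whether an upload matches the database of known content."""
    return fingerprint(upload) in hash_database

print(should_block(b"known prohibited image bytes"))   # → True (exact match)
print(should_block(b"known prohibited image bytes!"))  # → False (one byte changed)
```

The final check shows a key limitation of exact-match hashing: change a single byte and the fingerprint no longer matches. This is one reason perceptual hashing and machine-learning classifiers are used alongside hash databases, and why Gorwa, Binns and Katzenbach (2020) treat such matching systems as only one part of the algorithmic moderation toolkit.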

Articles

Read Before Seminar  Kate Klonick, ‘The New Governors: The People, Rules, and Processes Governing Online Speech’ (2017) 131 Harvard Law Review 1598

This paper surveys the history of the governance of content moderation in US firms. It argues that content moderation is rooted implicitly in a US First Amendment mentality and culture. To what extent do you think that is still true?

Read Before Seminar  Robert Gorwa, Reuben Binns and Christian Katzenbach, ‘Algorithmic Content Moderation: Technical and Political Challenges in the Automation of Platform Governance’ (2020) 7 Big Data & Society 2053951719897945.

         This paper looks at the growth of algorithmic systems in content moderation, and the limitations that come from using them.

Read Before Seminar  Stephanie Law, ‘Effective Enforcement of the Online Safety Act and Digital Services Act: Unpacking the Compliance and Enforcement Regimes of the UK and EU’s Online Safety Legislation’ (2024) 16 Journal of Media Law 263.

Recommended                João Pedro Quintais, Naomi Appelman and Ronan Ó Fathaigh, ‘Using Terms and Conditions to Apply Fundamental Rights to Content Moderation’ (2023) 24 German Law Journal 881.

Further          Konstantina Palla and others, ‘Policy-as-Prompt: Rethinking Content Moderation in the Age of Large Language Models’, Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency (Association for Computing Machinery 2025).

This looks at the much more recent paradigm of baking a content moderation policy into a prompt of a large language model to translate it into practice, its possibilities and limitations and its effect on content moderation bureaucracies.

Further          Jacob JW van de Kerkhof, ‘Jawboning Content Moderation from a European Perspective’ in Charlotte van Oirsouw and others (eds), European Yearbook of Constitutional Law 2023: Constitutional Law in the Digital Era (TMC Asser Press 2024)

Further          Christoph Busch, ‘Regulating the Expanding Content Moderation Universe: A European Perspective on Infrastructure Moderation’ (2022) 27 UCLA Journal of Law and Technology 32.

Statute

European Union

Read Before Seminar  Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1,

Make sure you look at least at: arts 14(4) (Terms and conditions), 20 (Internal complaint-handling system), 21 (Out-of-court dispute settlement), 22 (Trusted flaggers), 23 (Measures and protection against misuse), 24 (Transparency reporting obligations for providers of online platforms), 33 (Very large online platforms and very large online search engines), 34 (Risk assessment), 35 (Mitigation of risks), 37 (Independent audit), 38 (Recommender systems), 39 (Additional online advertising transparency), 40 (Data access and scrutiny), 44 (Standards), 74 (Fines). In relation to art 24, look at the linked database that has been established as a result of that article.

United Kingdom

Read Before Seminar  Online Safety Act 2023, part 3 ch 2 (“Providers of user-to-user services: duties of care”, read with interpretation in ch 7 as required), part 4 ch 3 (“Terms of service: transparency, accountability and freedom of expression”)

Case Law

European Court of Human Rights

Recommended                Sanchez v France (ECtHR, Grand Chamber, App no 45581/15) ECLI:CE:ECHR:2023:0515JUD004558115

Concerns obligations to remove hate speech. See analysis of the potential effect of this judgment in Jacob van de Kerkhof, ‘Sanchez v France: The Expansion of Intermediary Liability in the Context of Online Hate Speech’ (Strasbourg Observers, 17 July 2023).

Kenya

Further          Meareg & ors v Meta Platforms (Petition E541 of 2022) [2025] KEHC 4362.

An ongoing case, allowed to proceed in the High Court, challenging how Facebook’s algorithmic and content moderation actions affected violence in Ethiopia.

Further          Arendse & 42 ors v Meta Platforms, Inc & 3 ors (Constitutional Petition E052 of 2023) [2023] KEELRC 1398.

See paragraph 4 of this case for the alleged psychological impact on content moderators in the Ethiopian context (the litigation is in Kenya due to the location of the content moderation contractor, Sama). Note, however, that the interim relief in this case has been struck out in Samasource EPZ Limited t/a Sama v Meta Platforms, Inc & 186 others (Civil Appeal E595 of 2023) [2024] KECA 1152. See further on this case Robert Booth and Caroline Kimeu, ‘PTSD, Depression and Anxiety: Why Former Facebook Moderators in Kenya Are Taking Legal Action’ (The Guardian, 18 December 2024).

United States

Further          Lindke v. Freed 601 U.S. 187 (2024).

This US Supreme Court case examines when a public official can block or delete comments from a social media account they hold, given that they might sometimes be acting in a governmental capacity and so be subject to First Amendment restrictions themselves. It holds that a distinction must be drawn between posts that are private in nature and those made in a governmental capacity, although difficult and mostly unresolved questions remain as to blocking, which might operate on an account or page basis rather than post by post. A previous case involving Donald Trump addressed this matter, also restricting his blocking or deleting of comments, but was vacated by the Supreme Court after Joe Biden was elected because the issue had become moot.

 

💡 The Global Freedom of Expression project at Columbia University maintains an easy-to-use database of cases around the world that concern free expression, many of them touching upon tricky issues of content moderation. Note that the database does not always update when cases are later overturned, so check them in legal databases for that purpose.

Seminar 6 Fair Moderation and Takedown Abuse

Professor Michael Veale

Liability shields can be lifted by actual knowledge of illegal content, which creates the possibility that liability might flow to the intermediary rather than an end-user. Other regimes create actual obligations to take down content (rather than just the ominous threat of potential civil or even criminal action that may never materialise). This means that a notification or a complaint can be a powerful thing. On the Internet, people misuse powerful things. This is combined with the fact that such notifications or complaints must be interpreted by intermediaries, with discretion — sometimes called private governance — that can lack the kind of accountability and care we might seek from salient societal decisions.

Questions to prepare for the seminar

·       What are the legal mechanisms which bind platforms in different jurisdictions to receive and act on complaints? What considerations do they have to take into account?

·       In what ways might takedowns, account suspensions, and similar be abused?

·       What was Facebook/Meta trying to do with the “Oversight Board”? Look at its website. Is it window dressing or a real process?

·       What rights do users have under the DSA or OSA to challenge content moderation decisions they disagree with? Do you think these will be effective?

·       What measures do you think should surround and govern content moderation decisions online?

Readings

Read Before Seminar  Courtney C Radsch, ‘Weaponizing Privacy and Copyright Law for Censorship’ (Centre for International Governance Innovation, CIGI Papers No. 276, May 2023)

         This report looks at how copyright law, focussing mostly on the DMCA in the United States (a notice-and-takedown regime), has been used by certain interests as a tool to take down content, going beyond bona fide claims by rightsholders and impinging upon genuine expression.

Read Before Seminar  Evelyn Douek, ‘The Meta Oversight Board and the Empty Promise of Legitimacy’ (2024) 37 Harvard Journal of Law & Technology 373.

         This is a commentary on Facebook’s attempt to make a pseudo-court-like appeals body for controversial moderation decisions. You can see some of the cases the body has taken here. Note how they are often written in a legal style.

Recommended                Colten Meisner, ‘The Weaponization of Platform Governance: Mass Reporting and Algorithmic Punishments in the Creator Economy’ (2023) 15(4) Policy & Internet 466.

         This paper interviews different creators and social media users who have had coordinated attempts to takedown their materials as a form of censorship.

Recommended                Martin Husovec, ‘Fair Moderation Process’, Principles of the Digital Services Act (Oxford University Press 2024).

Further          Ben Wagner and others, ‘Regulating Transparency? Facebook, Twitter and the German Network Enforcement Act’ (2020) Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency. 8 min talk on the paper

This paper looks at the design patterns that make it difficult for users to submit a legally binding takedown notice under (now superseded) German content moderation law. It looks at how Facebook buries the options to make a legally binding complaint very, very deeply, and argues that this explains why very few complaints are made this way in Germany.

Further          Kate Klonick, ‘The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression’ (2019) 129 Yale Law Journal 2418.

This is an earlier piece than Douek (2024) above, and arguably more positive and hopeful about the potential of the Oversight Board.

Further          Jennifer Urban and others, ‘Notice and Takedown in Everyday Practice’ (UC Berkeley Public Law Research Paper No. 2755628, 2017)

Read the introduction & executive summary (pp 1-13) and delve into the bits that interest you — it’s a long report, but it is really an excellent view into how notice-and-action schemes work on-the-ground. It’s also very good practice for navigating a long document and pulling out the bits you are drawn to!

Statute

United Kingdom

Read Before Seminar  Online Safety Act 2023 s 17–23, 71–72, and relevant interpretative sections.

United States

Further          17 U.S.C. § 512 (Digital Millennium Copyright Act, “Limitations on liability relating to material online”)

See in particular subsection (g), describing the counter-notice procedure. Do you think that this would be used in practice? Additionally, note that (c)(3)(A)(vi) requires a statement, under penalty of perjury, that the complainant is authorised to act on behalf of the rightsholder (i.e. falsely claiming to be the rightsholder risks perjury).

European Union

Read Before Seminar  Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1.

         Ensure you look at least at: arts 14(4) (Terms and conditions), 16 (Notice and action mechanisms), 20 (Internal complaint-handling system), 21 (Out-of-court dispute settlement), 22 (Trusted flaggers), 23 (Measures and protection against misuse), 24 (Transparency reporting obligations for providers of online platforms). In relation to art 24, look at the linked database that has been established as a result of that article. You may wish to refer to Martin Husovec, ‘Fair Moderation Process’, Principles of the Digital Services Act (Oxford University Press 2024) if you are struggling to make sense of these provisions.

Seminar 7 Specific and General Monitoring

Professor Michael Veale

Intermediary liability shielding, as established in the previous session, immunises certain actors against liability in certain situations, sometimes conditional on them doing or not doing a certain thing. They often have no obligation to look for content actively, which would, potentially, remove their immunity. To make this meaningful, EU law contains a prohibition on imposing ‘general monitoring’ obligations. In this session, we will consider what this means, and how far it might really apply today, in a complex Internet landscape of intermediaries that do (and can do) more than just host whatever content they are told.

Questions to prepare for the seminar

·       Why might a prohibition on general monitoring obligations be justified? What kind of issues is it trying to prevent?

·       Has the CJEU changed its understanding of general monitoring over time? What are the arguments for and against this kind of change?

·       Is it easy to search for illegal content now that we have more advanced technologies to help us do it? What can go wrong? Is it just a matter of time before these technologies become good enough?

Articles

Read Before Seminar  Martin Husovec, ‘Prohibition of General Monitoring Obligations’ in Martin Husovec (ed), Principles of the Digital Services Act (Oxford University Press 2024).

Read Before Seminar  Daphne Keller, ‘Facebook Filters, Fundamental Rights, and the CJEU’s Glawischnig-Piesczek Ruling’ (2020) 69 GRUR Int 616. paywall link / UCL link

         This paper is critical of the Glawischnig-Piesczek case, arguing that many parts of it might result in overreach.

Recommended                Christina Angelopoulos and Martin Senftleben, ‘An Endless Odyssey? Content Moderation Without General Content Monitoring Obligations’ (SSRN Working Paper, 2021).

Recommended                Joris Van Hoboken and Daphne Keller, ‘Design Principles for Intermediary Liability Laws’ (Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression Working Paper, 8 October 2019)

         This guide looks at the different configurations and components commonly found in international laws and proposals around intermediary liability.

Recommended                Robert Gorwa, Reuben Binns and Christian Katzenbach, ‘Algorithmic Content Moderation: Technical and Political Challenges in the Automation of Platform Governance’ (2020) 7(1) Big Data & Society.

         This paper critically considers the limits of automating content moderation policies.

Further          Jennifer Cobbe and Jatinder Singh, ‘Regulating Recommending: Motivations, Considerations, and Principles’ (2019) 10(3) European Journal of Law and Technology.

         This article predates the judgment in YouTube and Cyando, so please read its discussions of recommenders and European intermediary liability law in light of that case.

Further          Aleksandra Kuczerawy, ‘General Monitoring Obligations: A New Cornerstone of Internet Regulation in the EU?’ in Rethinking IT and IP Law - Celebrating 30 years CiTiP (Intersentia 2019).

Further          Giovanni Sartor, ‘The Impact of Algorithms for Online Content Filtering or Moderation: Upload Filters’ (European Parliament 2020).

Further          Maayan Perel and Niva Elkin-Koren, ‘Accountability in Algorithmic Copyright Enforcement’ (2015) 19 Stan Tech L Rev 473.

Further          Graham Smith, ‘5.6 Liability of Online Intermediaries’ in Internet Law and Regulation (Sweet and Maxwell 2020) Book available on Westlaw UK

Further          Giancarlo Frosio (ed.) Oxford Handbook of Online Intermediary Liability (Oxford University Press 2020) Closed access DOI UCL link

This is a useful tome for further reading on intermediary liability, in particular comparing different regimes beyond the UK, Europe or the United States.

Statute

European Union

Read Before Seminar  (“DSA”) Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1, recital 30, art 8, art 14

Case Law

European Union

Read Before Seminar  Case C-18/18 Glawischnig-Piesczek v Facebook ECLI:EU:C:2019:821

         A defamation case which looks at what kind of obligations can be placed on an intermediary to take down content that is similar to that which is subject of the initial complaint. Look carefully at the wording in the judgment, and consider its link to the availability of search tools, particularly in light of readings above such as Keller (2020) as well as Gorwa et al. (2020).

Recommended                Case C-70/10 Scarlet Extended ECLI:EU:C:2011:771 (relates to mere conduits) or Case C-360/10 Netlog ECLI:EU:C:2012:85 (same issue as Scarlet, relates to hosting on Netlog, a now-defunct Belgian social network)

You do not need to read both of these as you will see they are very similar. Scarlet is probably marginally more important to read.

Recommended                Case C-492/23 Russmedia

Further          Case C-314/12 Telekabel ECLI:EU:C:2014:192 paras 26-50, 106-114

Further          Case C-484/14 Tobias Mc Fadden v Sony Music Entertainment Germany GmbH ECLI:EU:C:2016:689.

This case concerned a WiFi network in a shop that was being used to break the law. A court asked the CJEU whether an injunction against the shop was compatible with the e-Commerce Directive if it required scanning of all content, termination of the WiFi network, or password-protecting the network. The CJEU stated that of the three, only the last was allowed, as long as it was effective and required users to reveal their identity in order to get the password.

Further          Case C-401/19 Poland v Parliament ECLI:EU:C:2022:297

United Kingdom

Recommended                Montres Breguet SA & Ors v Samsung Electronics Co Ltd & Anor [2023] EWCA Civ 1478.

See [93]–[105]. This case concerns watch faces on the Samsung Galaxy Store (for its watches) which infringed Swatch trademarks. By operating the app store in the way that it did, such as by checking apps against its ‘content review guide’, Samsung took an ‘active’ role and was thus refused hosting liability shielding. Note that the court indicates that looking for illegal content might not by itself give it an active role. How can we reconcile this with intermediary liability law?

 

Seminar 8 By-Design Platform Regulation

Professor Michael Veale

The design of online platforms has come under scrutiny, as they are accused by many of amplifying different kinds of harmful content or behavioural patterns. While some of this comes down to specific pieces of content, other policy efforts have focussed on the underlying structure of these platforms. How successful will an attempt to map policy challenges to design features be?

Questions to prepare for the seminar

·       What kind of features of online platforms have generated concern in relation to their design? Draw on the readings and your own awareness and experiences.

·       What are the assumptions of regulation-by-design? Can you envisage issues it might be good at fixing, as well as those it might not be?

·       What are the main challenges with implementing a duty such as this? How might a platform make itself resistant to this kind of governance, or use techniques which in practice minimise the impact of this mode of regulation?

Readings

Read Before Seminar  Rachel Griffin, ‘The Law and Political Economy of Online Visibility: Market Justice in the Digital Services Act’ [2023] Technology and Regulation 69.

         This article asks how deeply the DSA does or does not go in regulating the underlying business models of social media: what those models are, and why they might need changing in order to make real policy change.

Read Before Seminar  Graham Smith, ‘Take Care with That Social Media Duty of Care’ (Cyberleagle, 19 October 2018).

         Professor Lorna Woods, together with Will Perrin of the Carnegie Trust, initially proposed a short law, which inspired the Online Safety Act, suggesting that social media platforms should owe a tort-inspired duty of care. The Act that actually followed, after a very long parliamentary process, contained many more duties which, while they do resemble duties of care, are not labelled as such. This blog, from veteran internet lawyer Graham Smith of Bird & Bird, criticises that approach; read its criticisms in light of what you have read in the Online Safety Act and come to your own views.

Read Before Seminar  Uta Kohl, ‘Toxic Recommender Algorithms: Immunities, Liabilities and the Regulated Self-Regulation of the Digital Services Act and the Online Safety Act’ (2024) 16 Journal of Media Law 301

         This comparative article should contribute to your understanding of the distinction between the EU and UK regimes.

Recommended                Martin Husovec, ‘Will the DSA Work? On Money and Effort’ in Joris Van Hoboken and others (eds) Putting the DSA into Practice: Enforcement, Access to Justice, and Global Implications (Verfassungsbooks 2023).

         This short piece takes a more practical look at the enforcement of the DSA, and the promise (and pitfalls) of some of its provisions.

Recommended                Daphne Keller, ‘Amplification and Its Discontents: Why Regulating the Reach of Online Content Is Hard’ (2021) 1 Journal of Free Speech Law 227.

Recommended                Laurens Naudts and others, ‘Toward Constructive Optimisation: A New Perspective on the Regulation of Recommender Systems and the Rights of Users and Society’ in Natali Helberger and others (eds), Digital Fairness for Consumers (European Consumer Organisation (BEUC) 2024) pp 38–68

Further          Julie E Cohen, ‘Law for the Platform Economy’ (2017–18) 51 UCD L Rev 133.

This piece lays out theory and practice of platforms in the legal order, arguing that platforms construct their power out of law, rather than existing in a lawless space, and considering some of the tactics and approaches they use to be slippery and difficult to govern.

Further          María P. Angel and danah boyd, ‘Techno-legal Solutionism: Regulating Children’s Online Safety in the United States’ (2024) CSLAW’24: 3rd ACM Computer Science and Law Symposium.

Critiques how far a ‘safety-by-design’ approach can deal with the areas it targets, and whether it takes a dangerously techno-solutionist approach to issues such as mental health which resist such fixes.

Further          Lorna Woods and Will Perrin, ‘Obliging Platforms to Accept a Duty of Care’ in Martin Moore and Damian Tambini (eds), Regulating Big Tech (Oxford University Press 2021) (🔒 UCL)

Statute

European Union

Read Before Seminar  Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1.

         Ensure you look at least at: arts 25 (Online interface design and organisation), 27 (Recommender system transparency), 28 (Online protection of minors), 33 (Very large online platforms and very large online search engines), 34 (Risk assessment), 35 (Mitigation of risks), 37 (Independent audit), 38 (Recommender systems), 39 (Additional online advertising transparency), 40 (Data access and scrutiny), 44 (Standards), 74 (Fines).

United Kingdom

Read Before Seminar  Online Safety Act 2023, part 3 ch 2; s 121; and relevant interpretative sections as required.

Singapore

Further          Broadcasting Act 1994, part 10A, division 3.

Further          Info-communications Media Development Authority (IMDA), ‘Broadcasting Act 1994: Code of Practice for Online Safety’ (17 July 2023).

This is the code of practice referred to in the Broadcasting Act 1994 (Singapore) s 45L. It currently applies to the entities listed by IMDA in this document.

Australia

Further          Online Safety Act 2021, part 4.

In relation to proactive duties, while the Act does operate through codes of conduct, similar to Singapore, social media firms only have reporting obligations on how they are meeting these duties; the eSafety Commissioner does not have substantive powers unless these reporting duties are not met.


 

Seminar 9 Verification: Ages, Names, Identities

Professor Michael Veale

A 1993 New Yorker cartoon proclaimed that “on the Internet, nobody knows you’re a dog”. The same veneer of anonymity that computer-using canines made use of has also been key to allowing individuals to freely use the Internet. This freedom to access or create content regardless of your position in society, or which societies you are part of, has been an aspect of modern self-development, but it has also created concerns that individuals might act with impunity or, particularly in the case of minors, encounter content that is truly harmful. Both law and policy have reacted with approaches centring on verification — different levels of assurance that someone is who they claim to be, or that they have a certain characteristic, such as meeting an age threshold. These provisions interact with an array of different rights and interests, and are currently reshaping people’s experiences online. In this seminar, we will examine all of this, and help you come to your own views about how modern Internet services should function.

Questions to prepare for the seminar

·       What is the difference between age verification and age assurance? What are the consequences of this distinction for policy effectiveness and for rights and freedoms, and what are the trade-offs?

·       What are the impacts of age verification/assurance on 1) the rights of children and 2) the rights of adults?

·       What are some of the practical challenges in implementing age verification/assurance, and how might these be approached? How, in your view, should they be?

·       How does the ‘verified’ user regime work? How does this connect to freedom of expression and online environments?

·       Should people on social media be obliged to use their real names? Should they be obliged to lodge their real identities with the platforms for the purposes of, for example, law enforcement access?

Readings

Read Before Seminar  Sonia Livingstone and others, ‘Children’s Rights and Online Age Assurance Systems: The Way Forward’ (2024) 32 The International Journal of Children’s Rights 721.

         This piece blends social science with a consideration of existing and forthcoming policy measures, and presents a nuanced view on age verification and assurance technologies in context.

Read Before Seminar  Eric Goldman, ‘The “Segregate-and-Suppress” Approach to Regulating Child Safety Online’ (2025) 28 Stanford Technology Law Review 173.

         This US law review article broadly argues against age verification and/or assurance systems, with a stronger focus on broad freedom of expression than the above reading. There is no need to engage deeply with the details of the varying US state laws referenced, nor the details of US constitutional issues — focus on the policy questions which cut across jurisdictions. Note that this piece was written prior to the Free Speech Coalition, Inc. v. Paxton US Supreme Court case referred to below (which confirmed the constitutionality of these systems in a 6-3 partisan split).

Read Before Seminar  danah boyd, ‘The Politics of “Real Names”’ (2012) 55 Commun. ACM 29.

         A short piece — read in combination with Online Safety Act 2023, s 15 and s 64, and reflect on the differences. Many policymakers have called for a “real name policy” on social media, or to require identification to open an account. The policy in the Online Safety Act 2023 follows previous private members’ bills in Parliament which called for 1) users to be able to verify themselves and 2) users to be able to block those who are ‘unverified’. What do you make of such policies?

Recommended                Victoria Nash, ‘Gatecrashers? Freedom of Expression in an Age-Gated Internet’ in Alistair S Duff (ed), Research Handbook on Information Policy (Edward Elgar 2021).

This piece provides both an overarching free expression discussion (complementing Goldman (2025) above) and some of the UK legal backdrop prior to the Online Safety Act 2023; note that this pre-Act backdrop is no longer good law.

Recommended                Zahra Stardust and others, ‘Mandatory Age Verification for Pornography Access: Why It Can’t and Won’t “Save the Children”’ (2024) 11 Big Data & Society 20539517241252129.

         You can skim the section on the quantitative auditing method used — focus only on the conclusions of that part.

Further          Martin Sas and Jan Tobias Mühlberg, ‘Trustworthy Age Assurance? A Risk-Based Evaluation of Available and Upcoming Age Assurance Technologies from a Fundamental Rights Perspective’ (The Greens/EFA in the European Parliament, February 2024).

This report considers the technologies in more detail in the context of European law (existing and proposed), which is not considered as much by the readings above.

Further          Molly Buckley, ‘Americans, Be Warned: Lessons From Reddit’s Chaotic UK Age Verification Rollout’ (Electronic Frontier Foundation, 8 August 2025).

         A short blog on some of the implementation controversies surrounding the Online Safety Act’s age verification obligations.

Further          European Commission, ‘Annex to the Communication to the Commission: Approval of the content of a draft Communication from the Commission – Guidelines on measures to ensure a high level of privacy, safety and security for minors online, pursuant to Article 28(4) of Regulation (EU) 2022/2065’ (C(2025) 4794 final, 14 July 2025).

These are guidelines issued under the EU Digital Services Act art 28.

Statute

United Kingdom

Read Before Seminar  Online Safety Act 2023 s 1–3, 6, 7, 11–13, 15 (especially noting 15(9–10)), 35–37, 64.

         Read these carefully! Some of these may take some imagination as to how they might exist in practice, as the law is in its early days and regulatory guidance is in either early iterations or does not exist at all for some parts.

Further          Data Protection Act 2018 s 123 (“Age-appropriate design code”)

Read this alongside the code that relates to it: Information Commissioner’s Office, ‘Age appropriate design: a code of practice for online services’ (ICO 2020).

Further          Digital Economy Act 2017 (as enacted), part 3 (“Online Pornography”)

This part, which has been repealed by the Online Safety Act 2023, was a failed regulatory attempt to introduce age verification in the United Kingdom, and is of interest as background context. It was never substantively commenced for enforcement, although it was commenced for the purposes of designating a regulator (the BBFC, typically in charge of assigning age ratings such as 12A to films) and establishing standards.

European Union

Read Before Seminar  Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1, arts 28, 35(1)(j)

         See under Readings the guidelines under Article 28 that have been issued by the European Commission.

Case Law

United States

Further          Free Speech Coalition, Inc. v. Paxton, 606 U.S. ___ (2025)

In this case, a decision split along the Supreme Court’s ideological lines, the majority argues that an age verification law does not directly regulate the protected speech of adults as it “simply requires proof of age to access content that is obscene to minors”, meaning it escapes the strict scrutiny test (which many laws fail), under which a state must show it chose the least speech-restrictive method of achieving a policy goal. In practice, this means that a maximally burdensome method, such as requiring the upload of government ID as many states have chosen to do, can go forward.


 

Seminar 10 Moderating AI: Models and Outputs

Professor Michael Veale

Large language models (LLMs) have fast become a major way through which content is created and accessed. Both accessing and generating such content comes with a significant host of challenges, with some tasks these systems are put to being harmful (such as fraud, or generation of intimate images for abusive purposes) and some output being erroneous and potentially even libellous. The ease and scale with which content can be generated is posing challenges for online platforms’ existing content moderation practices, and potentially also challenging the fine balances established in the legal regimes we have already examined in this module.

 

We are also seeing content become more complex. New platforms have sprung up hosting models, or finetuned versions of them, which are good at, for example, generating specific types of images. These have benign uses, but can also be used maliciously, such as to create deepfakes of well-known individuals, Internet influencers, or simply members of the public. These models are themselves content on model marketplaces, but unlike content such as images or text (where we might think illegality is relatively simple to appraise at face value), models with malicious (and potentially also benign) purposes are challenging hosting intermediaries and exposing fault-lines in the law.

 

Some emerging regulatory regimes are approaching the issue, such as the EU AI Act, which has placed some transparency and watermarking obligations in relation to synthetic content. Will they be enough?

Questions to prepare for the seminar

·       To what extent do the Online Safety Act or the Digital Services Act cover generative AI systems? Consider the scope and definitions of these instruments that we studied in previous sessions.

·       Consider the transparency provisions in the EU AI Act. Might they be effective at regulating issues of synthetic content? If not, why not?

·       How does the generation of content affect existing content moderation practices on online platforms? What might law and policy need to do to react to this?

·       Consider models themselves as content. Is the intermediary liability regime properly set up to deal with these types of content? If not, why not?

Readings

Read Before Seminar  Robert Gorwa and Michael Veale, ‘Moderating Model Marketplaces: Platform Governance Puzzles for AI Intermediaries’ (2024) 16 Law, Innovation and Technology 341.

         This article looks at major AI model hosting platforms such as Hugging Face and Civitai, examining what happens when content (such as that protected by intermediary liability shielding) is not an image, video, or text, but a complex piece of software with multiple potential uses. Pay attention to the mix of private governance and legal consequences, and draw upon your knowledge of the intermediary liability regime.

Read Before Seminar  Jason Koebler, ‘AI Slop Is a Brute Force Attack on the Algorithms That Control Reality’ (404 Media, 17 March 2025).

         This news article considers how social media is coping with industries flooding it with low quality synthetic content. How might this change the practical task and effort of day-to-day content moderation? Might the underlying rules also need to change?

Read Before Seminar  Sarah A Fisher, Jeffrey W Howard and Beatriz Kira, ‘Moderating Synthetic Content: The Challenge of Generative AI’ (2024) 37 Philosophy & Technology 133.

         This paper looks at the policy and normative arguments concerning the regulation of synthetic content.

Recommended                Reuben Binns and Lilian Edwards, ‘ChatGPT Tells Fibs About Me: Are Data Protection and Libel Adequate Tools to Protect Reputation in the LLM Era?’ in Philipp Hacker and others (eds), The Oxford Handbook of the Foundations and Regulation of Generative AI (Oxford University Press 2025).

This chapter looks at how AI models can contain and reproduce both real and faulty facts relating to natural persons.

Recommended                Samantha Cole, ‘Bing Is Generating Images of SpongeBob Doing 9/11’ (404 Media, 4 October 2023).

This article looks at how difficult it can be for image generation providers to make rules (and automated flagging systems) that distinguish between permissible and impermissible images, even according to policies they design themselves. Why is this? Is this issue likely to go away or to continue?

Further          Philipp Hacker, Andreas Engel and Marco Mauer, ‘Regulating ChatGPT and Other Large Generative AI Models’, Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency (Association for Computing Machinery 2023).

This article, relatively early on in the commercialisation of large language models, briefly applies different European legal regimes to AI systems. See especially its analysis and take on the Digital Services Act. Consider how tools like ChatGPT, Meta AI or similar work today. Does this hold true?

Further          Robert Chesney and Danielle Keats Citron, ‘Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security’ (2019) 107 California Law Review 1753.

This early article, before the current generation of image generation systems, looks at the societal risk of ‘deepfake’ technologies from a wide variety of standpoints. If you read this, pay more attention to the social and policy issues than speculation around US law, which we will not focus on here.

Statute

European Union

Read Before Seminar  Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act), art 50 (refer to definitions and scope in arts 1–3 as appropriate).

         These transparency provisions differ from other provisions in the AI Act and stand relatively alone (although look at the definitions to contextualise them). You may also wish to note that the regulator is a ‘market surveillance authority’, a product authority that will have limited-to-no practical experience of Internet regulation.

Further          European Commission, Code of Conduct on Disinformation (2025). 

         This is a code of conduct under DSA, art 45 — see below.

Further          Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277/1, art 45.

United Kingdom

Further          Crime and Policing Bill (at the time of writing, HL Bill 111) ch 1 (“Child Sexual Abuse”)

This may pass at some point during the module.

Further          Sexual Offences Act 2003 ss 66B–66D

These offences — “Sharing or threatening to share intimate photograph or film” — were added by the Online Safety Act 2023. Note the exemption for internet service providers in s 66C, which means that the underlying liability would not shift to the provider, although the obligations to take down under the OSA (and regulatory penalties for failing to have an effective process) may apply.


 

Seminar 11 Privacy Online

Professor Orla Lynskey

In this seminar, we will introduce some of the key concepts that will inform our discussions this term. We will identify the socio-technical changes that have brought new challenges for data protection and privacy rights, primarily digitisation and datafication. We will also introduce some legal frameworks applicable to the processing of personal information at international level and consider their key characteristics, including where they sit on a spectrum between rights-based and market-oriented models. Finally, we will zoom in on the EU legal framework and discuss the fundamental rights that undergird it, principally the rights to data protection and privacy. We will critically consider whether there is — and should be — a legal distinction between rights to data protection and informational privacy rights, and what this might mean in practice.

Questions to prepare for the seminar

·       Does the development of the internet challenge established privacy rights? If so, how?  

·       What social functions does ‘privacy’ play?

·       The CJEU’s judgment in Lindqvist provided a forewarning of some of the major challenges to the application of data protection online. What were they? Why were these difficult for the Court to address? What might have happened if the Court had decided otherwise?

·       How do you think of the distinction between data protection and privacy? Does this distinction help us to address some of the conceptual challenges of applying ‘privacy’ to the internet and related technologies?

Readings

Read Before Seminar  Samuel D Warren and Louis D Brandeis, ‘The Right to Privacy’ (1890) 4 Harvard Law Review 193.

Read Before Seminar  Plixavra Vogiatzoglou and Peggy Valcke, ‘Two Decades of Article 8 CFR: A Critical Exploration of the Fundamental Right to Personal Data Protection in EU Law’, in Eleni Kosta, Ronald Leenes and Irene Kamara (eds) Research Handbook on EU Data Protection Law (Edward Elgar 2022) [preprint link].

Recommended                Neil Richards, ‘The Dangers of Surveillance’ (2013) 126 Harvard Law Review 1934.

Recommended                Julie E Cohen, ‘What Privacy Is For’ (2012) 126 Harvard Law Review 1904.

Further          Bert-Jaap Koops and others, ‘A Typology of Privacy’ (2017) 38 University of Pennsylvania Journal of International Law 483.

Case law

European Union

Read Before Seminar  Case C-101/01 Criminal Proceedings against Bodil Lindqvist EU:C:2003:596.

Statute

European Union

Read Before Seminar  Charter of Fundamental Rights of the European Union [2012] OJ C 326/391 arts 7–8.

Recommended                Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L 119/1 (EU GDPR).

💡 Now would be a good time to print out the GDPR so you are annotating a consistent copy. The neatest version to print is likely the PDF from the EUR-Lex service, as passed in 2016.

United Kingdom

Recommended                Retained Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L 119/1 (UK GDPR).

Many of the amendments, such as those in the Data (Use and Access) Act 2025, are made to the retained version of the GDPR, referred to as the UK GDPR. When you read the UK GDPR, focus on what has been changed. What is different, and why might it have been altered?

Further          Data (Use and Access) Act 2025

Further          Data Protection Act 2018

 

💡🇬🇧 Brexit Note 🇪🇺: UK statutory data protection law is a mess. Here’s what you need to know to navigate it.

The GDPR was passed in the EU in 2016 and became applicable in May 2018 (including in the UK, before Exit Day at the end of 2020). On Exit Day, the GDPR (now called the EU GDPR in the UK for clarity) was frozen and passed into UK law directly (as ‘retained law’ by virtue of the European Union (Withdrawal) Act 2018). It was henceforth known as the UK GDPR, and was edited very slightly and largely inconsequentially (e.g. to replace ‘European Commission’ with ‘Secretary of State’ on some occasions). The UK GDPR confusingly has almost the same citation as the EU GDPR — we usually distinguish these types of laws by calling them Retained Regulation [..]. You can find it on the legislation.gov.uk website.

The Data Protection Act 2018 is a different thing entirely. It is not technically a transposition of the GDPR — recall that only Directives are transposed. However, the GDPR contains many ‘opening clauses’ which leave discretion to Member States as to how they are introduced into domestic law. The Data Protection Act 2018 adjusts these opening clauses to the domestic context. For example, the GDPR imposed obligations on Member States to establish regulators with certain powers; as a matter of national administrative law, there needed to be some implementing measures for that. Furthermore, the GDPR allowed Member States to restrict some rights under clearly delimited conditions (e.g. not allowing people to request copies of their exam marks before they are released to everyone), and to legislate for certain processing to be explicitly allowed (e.g. laying out what is a ‘substantial public interest’, such as collecting sensitive ethnicity data for the purposes of equality analysis).

In addition, the Data Protection Act 2018 does transpose the Law Enforcement Directive (data protection for policing) into domestic law, and implements the Council of Europe data protection convention, Convention 108+, so that it applies to the intelligence services, which otherwise would be unregulated in some domains. It also contains a few little easter eggs, some of which were inserted by amendment during the parliamentary process, such as the ‘Age Appropriate Design Code’, which is a sort of proto-Online Safety Act regime that acts through data protection law.

But broadly, if you are looking for the actual text of data subject rights and data controller obligations in relation to private actors, or most public sector actors outside law enforcement and intelligence types, you won’t find them in the Data Protection Act 2018 — although you may need to refer to it to understand how they are restricted or modified.

On top of this, the Data (Use and Access) Act 2025, another piece of UK primary law, amended UK data protection law in substantive ways. It mostly did this by directly amending the UK GDPR — the retained regulation — itself. So the retained regulation — secondary law — has been amended by primary law, and is now in some parts quite different from the EU GDPR. That being said, most things, and the fundamental structure, are still the same.

Seminar 12 Data Protection’s Scope

Dr Tommaso Fia

In this session, we will discuss the material, personal and territorial scope of the EU data protection framework, paying particular attention to material scope. EU and UK data protection law, like most other data protection frameworks globally, applies to the processing of personal data. However, its boundaries have long been the subject of debate, and are an important point of interaction for law, policy, computer and data science.

McAuley describes why computer scientists find anonymisation difficult to achieve. In her seminal paper, Purtova argues that the CJEU has interpreted the GDPR so expansively that an unmanageable array of information types can be classified as personal data (but compare Dalla Corte, who critiques this reading of the law, and Finck and Pallas, who provide a more contextually grounded perspective). Elliot and others propose a different approach, which looks at the risk of data being reidentified in its environment.

Major challenges for data protection ensue. What would be the benefits or risks of adopting this approach? How should a controller, in practice, go about deciding what is and is not personal data? Is a workable balance between classifying data as personal or non-personal possible, or will any approach inevitably be gamed and abused? If so, is there a way out of this quandary?
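If you have not seen one before, the following minimal sketch (all records invented) illustrates the kind of linkage attack Sweeney demonstrated, which is central to why anonymisation is hard: two datasets that each look harmless on their own are joined on shared ‘quasi-identifiers’ such as postcode, date of birth and sex.

```python
# "Anonymised" hospital data: names removed, quasi-identifiers kept.
hospital = [
    {"postcode": "NW1 2BX", "dob": "1990-03-14", "sex": "F", "diagnosis": "asthma"},
    {"postcode": "E1 6AN",  "dob": "1985-07-02", "sex": "M", "diagnosis": "diabetes"},
]

# A public dataset (think electoral roll) with names attached.
register = [
    {"name": "A. Example", "postcode": "NW1 2BX", "dob": "1990-03-14", "sex": "F"},
    {"name": "B. Example", "postcode": "SE1 9GF", "dob": "1979-01-30", "sex": "M"},
]

def link(hospital, register):
    """Join the two datasets on the quasi-identifier triple."""
    matches = []
    for h in hospital:
        key = (h["postcode"], h["dob"], h["sex"])
        for r in register:
            if (r["postcode"], r["dob"], r["sex"]) == key:
                matches.append({"name": r["name"], "diagnosis": h["diagnosis"]})
    return matches

print(link(hospital, register))
# [{'name': 'A. Example', 'diagnosis': 'asthma'}]
```

Because most people are unique on a triple like this, stripping names alone rarely anonymises a dataset — one reason the computer science readings treat anonymisation as a property of data in its environment, not of a single file.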

Questions to prepare for the seminar

·       What is it about data that makes anonymisation particularly hard?

·       Can you think of a good example of data which is reliably non-personal under data protection law?

·       What sort of test does Breyer imply? What kind of capacities might data controllers need to carry out this test?

·       Does SRB provide more legal certainty as to how it is possible to distinguish between personal and non-personal data?

·       What is the nature of pseudonymous data?

·       Are there issues still left out?

·       Is this approach to personal data a sensible one for data protection law?

·       What might be the challenges if a narrower approach was taken?

·       What about a broader one?

·       Is the household exemption too narrow in scope?

Statute

European Union

Read Before Seminar  EU GDPR, recitals 26–30; arts 2, 4(1).

         Try to understand the household exemption, and the scope of personal data and processing.

Videos

Read Before Seminar  Video: Derek McAuley, The Anonymisation Problem (Computerphile 2017) length: 10 mins

Further          Latanya Sweeney, Reidentification Risks and the HIPAA Privacy Rule (US Department of Health and Human Services Conference on Data Privacy in the Digital Age, 26 October 2017) first 22 mins

Articles

Read Before Seminar  Nadezhda Purtova, ‘The Law of Everything. Broad Concept of Personal Data and Future of EU Data Protection Law’ (2018) 10 Law, Innovation and Technology 40.

Read Before Seminar  Lorenzo Dalla Corte, ‘Personal Data in the EU Legal System’ in Giovanni Comandè (ed) Elgar Encyclopedia of Law and Data Science (Elgar 2022).

Read Before Seminar  Lorenzo Dalla Corte, ‘Anonymous Data’ in Giovanni Comandè (ed) Elgar Encyclopedia of Law and Data Science (Elgar 2022).

Read Before Seminar  Sophia Stalla-Bourdillon, ‘Identifiability, as a Data Risk: Is a Uniform Approach to Anonymisation About to Emerge in the EU?’ (2025) European Journal of Risk Regulation 1.

         Contains useful context about anonymisation methods as well as detailed analysis up to 2025 (but excluding SRB).

Recommended                Michèle Finck and Frank Pallas, ‘They Who Must Not Be Identified—Distinguishing Personal from Non-Personal Data under the GDPR’ (2020) 10 International Data Privacy Law 11.

         Contains useful context about pseudonymisation methods such as hashing and salted hashing, as well as detailed legal analysis up to 2020.
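To make the hashing point concrete, here is a minimal sketch (using a fictional number from Ofcom’s reserved drama range) of why a plain hash of a low-entropy identifier is pseudonymisation rather than anonymisation: anyone can reverse it by hashing every candidate value, whereas a secret salt held by the controller blocks that shortcut for outsiders.

```python
import hashlib

def sha256(s: str) -> str:
    """Hex digest of a UTF-8 string."""
    return hashlib.sha256(s.encode()).hexdigest()

phone = "+44 7700 900123"        # fictional, Ofcom-reserved number
stored = sha256(phone)           # the 'pseudonymised' record key

# Dictionary attack: the candidate space is small enough to enumerate,
# so hashing every candidate and comparing recovers the original value.
candidates = ["+44 7700 900%03d" % n for n in range(1000)]
recovered = [c for c in candidates if sha256(c) == stored]
print(recovered)  # ['+44 7700 900123']

# Salted hashing: a secret value held by the controller is mixed in,
# so the same enumeration no longer works without the salt.
salt = "a-secret-value-held-by-the-controller"
salted = sha256(salt + phone)
assert all(sha256(c) != salted for c in candidates)
```

Note that for the controller who holds the salt, the salted hash remains reversible by the same enumeration — which is why salted hashing is usually analysed as pseudonymisation relative to the controller.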

Further          Nadezhda Purtova, ‘From Knowing by Name to Targeting: The Meaning of Identification under the GDPR’ (2022) 12 International Data Privacy Law 163.

Considers whether there is an alternative way to justify personal data, focusing on ideas of singling out and individuation rather than reidentifiability.

Further          Raphaël Gellert, ‘Personal Data’s Ever-Expanding Scope in Smart Environments and Possible Path(s) for Regulating Emerging Digital Technologies’ (2021) 11 International Data Privacy Law 196. UCL typeset link / OA preprint

Further          Yves-Alexandre de Montjoye and others, ‘Unique in the Crowd: The Privacy Bounds of Human Mobility’ (2013) 3 Scientific Reports 1376.

A seminal computer science paper on the futility of anonymising location data.

Further          Information Commissioner’s Office, Anonymisation Guidance (ICO 2025)

Cases

European Union

✏️ Note: There are a lot of cases here — practice going between them and the readings, reading fast and selectively, and skipping the bits that you will already know after reading them once/elsewhere (e.g. the legal setup).

Read Before Seminar  Case C-582/14 Breyer ECLI:EU:C:2016:779

         On the identifiability of personal data

Read Before Seminar  Case C‑434/16 Nowak ECLI:EU:C:2017:994

         On exam scripts and comments, and whether opinions are or are not personal data.

Read Before Seminar  Case C‑413/23 P EDPS v SRB ECLI:EU:C:2025:645

         On the identifiability of personal data and ‘relative’ nature of pseudonymised data. The ‘P’ indicates it is an appeal from the General Court [a T- case] — you may not have seen this before.

Recommended                Case C-101/01 Criminal Proceedings against Bodil Lindqvist EU:C:2003:596

On the scope of the household exemption.

Recommended                Case C-212/13 Ryneš ECLI:EU:C:2014:2428.

On a CCTV camera on a house and the household exemption.

Further          Joined Cases C-141/12 and C-372/12 YS and Others EU:C:2014:2081.

On the distinction between data and documents in relation to the scope of personal data.

Further          Case C-479/22 P OC v European Commission ECLI:EU:C:2024:215

On tests for re-identifiability.

Further          Case C-604/22 IAB Europe ECLI:EU:C:2024:214

On the information that a cookie banner sends about the consent options selected.

Further          Case C-659/22 Ministerstvo zdravotnictví (Application mobile Covid-19) ECLI:EU:C:2023:745

On the scanning of a vaccination certificate being personal data processing.

Further          Case C-446/21 Schrems (Communication de données au grand public) ECLI:EU:C:2024:834

On whether a statement by a person about his or her sexuality can constitute data ‘manifestly made public’ under the GDPR art 9, whether that can be relied upon by a third party collecting different data, and the extent of data collection permitted for advertising under data protection principles.

United Kingdom

Further          R (on the application of Edward Bridges) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin) [109]–[127].

This case concerns a situation where an individual walked past a facial recognition camera set up by South Wales Police. The part relevant to us is where the police claimed that the claimant (who was not flagged against a watch list) fell outside the scope of data protection law, because the data kept about him was immediately deleted upon not being matched and did not leave the physical camera device, and therefore could not meet the Breyer test, as there was no prospect of it being reasonably matched to him given that he was not on the watch list. The High Court called the Breyer test “artificial and unnecessary” in this situation, and created a strand of case law based instead on individuation possibility and system purpose. Note that this case was appealed, but South Wales Police dropped the challenge on the personal data scope question, so it was not considered by the Court of Appeal.

Seminar 13 Data Protection’s Substance

Professor Michael Veale

The broad scope of data protection law is tempered to some extent by the law’s legitimising force. Once within the scope of the law, data processing is legitimate if, first, it has a legal basis and, second, it complies with the principles of personal data processing (sometimes known as Fair Information Practice Principles or FIPPs). The most famous of the six legal bases is consent. However, it is a common misconception that consent is always required for personal data processing — data controllers can also rely on other legal bases to justify data processing, including legitimate interests and the public interest. Moreover, even where data processing has a legal basis, it must comply with all the data processing principles, which include data security, fairness and proportionality-based principles. In practice, these principles can have a significant disciplining effect on data processing. In this seminar, we will debate whether data protection law gets the balance right between protecting fundamental rights and legitimising data processing.

Questions to prepare for the seminar

✏️ Note: This session is statute heavy: please spend significant time with the GDPR and read the relevant articles and recitals in detail and note down any uncertainties and tensions you find.

·       What does data protection oblige data controllers to do?

·       What are the legal bases for processing? Can you think of examples where each of them would be used? Can you think of examples where choosing between them might be difficult?

·       What are the practical consequences of the conditions for, and definition of, consent?

·       Who decides what is, and is not, a legitimate interest? Can’t everything be processed as a legitimate interest? If not, why not?

·       What data would fall clearly into being ‘special category’ data? What data that nonetheless might be sensitive would not fall into this category?

·       What are the consequences of ‘data protection by design and by default’?

Statute

Read Before Seminar  EU GDPR, recitals 32, 39–58, 78, 89–96; arts 5–11, 13–14, 24, 25, 33–36.

         These recitals and articles broadly relate to the legal bases for processing and the main obligations for controllers that need to be undertaken regardless of whether the data subject has requested them to occur or not.

Cases

European Union

Read Before Seminar  Case C-252/21, Meta Platforms and Others ECLI:EU:C:2023:537

         On the appropriate legal basis for online profiling to subsidise the provision of ‘free’ online services. Important for consent, legitimate interests and contract.

Read Before Seminar  Case C-184/20 OT ECLI:EU:C:2022:601

         On conditions for inference of special category data. Introduces a new legal test — data that are capable of revealing [special category information] of a natural person by means of an intellectual operation involving comparison or deduction. Sometimes known as Vyriausioji tarnybinės etikos komisija but this is likely hard to remember.

Read Before Seminar  Case C-492/23 Russmedia Digital ECLI:EU:C:2025:935

         For the purpose of this seminar, consider what is special category data here, and what obligations it entails.

Further          Case C-673/17, Planet49 ECLI:EU:C:2019:801

         Interprets the meaning of consent under the GDPR

Further          Case C-13/16, Rigas Satiksme ECLI:EU:C:2017:336

On the meaning and application of legitimate interests balancing test.

Readings

Recommended                Chris Jay Hoofnagle and others, ‘The European Union General Data Protection Regulation: What It Is and What It Means’ (2019) 28 Information & Communications Technology Law 65.

Recommended                European Data Protection Board (EDPB) ‘Opinion 08/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms’ (EDPB 2024).

Opinion of the EDPB on ‘pay or okay’ and the application of the Meta Platforms judgment in practice

Further          Jef Ausloos, ‘Foundations of Data Protection Law’ in The Right to Erasure in EU Data Protection Law: From Individual Rights to Effective Protection (Oxford University Press 2020).

 

Seminar 14 Data Rights and Wrongs

Professor Orla Lynskey

Data protection isn’t just about obligations. A core aspect of the regime can be found in the rights it grants to data subjects in relation to personal data, ‘micro-rights’ that in the EU context could be said to flow from the larger fundamental right to data protection. One theory underpinning these rights is that they give individuals control over their personal data (sometimes referred to as ‘informational self-determination’). Another, related, perspective is that they enable individuals to play an active role in safeguarding their own fundamental rights when their personal data are processed. In the EU, these rights are relatively well-known and frequently exercised. Nevertheless, their scope and meaning are under constant appraisal, and there has been a lively debate in the US about whether it is desirable to introduce similar rights in that jurisdiction. One of the key queries is whether it is normatively desirable or practically feasible for individuals to play a role in safeguarding data relating to them. We will discuss these issues in this seminar.

Questions to prepare for the seminar

Readings

Read Before Seminar  Jef Ausloos and Michael Veale, ‘Researching with Data Rights’ [2020] Technology and Regulation 136.

         This article looks at a non-conventional use of data rights — their use in research, in particular to hold certain actors, including large internet platforms, to account. In doing so, it discusses the scope and limitations of access rights in particular. Do you think this is a legitimate use of data rights?

Read Before Seminar  Margot Kaminski, ‘The Case for Data Privacy Rights (Or 'Please, a Little Optimism')’ (2022) 97 Notre Dame Law Review Reflection 356.

Read Before Seminar  Ari Ezra Waldman, ‘Privacy's Rights Trap’ (2022) 117 Northwestern University Law Review 88.

Recommended                René Mahieu, ‘The Right of Access to Personal Data: A Genealogy’ [2021] Technology and Regulation 62.

This article looks at the background behind the introduction of the right of access. In particular, it looks at the philosophy of one of the ‘godfathers’ of data protection law, Italian jurist Stefano Rodotà. It argues that data access was designed as a form of counter power, and was envisaged to have collective dimensions.

Recommended                European Data Protection Board, ‘Guidelines 01/2022 on data subject rights - Right of access’ (EDPB 2023).

These are the regulatory guidelines on access rights produced by the EDPB. They are non-binding but can be persuasive for both regulators and courts.

Statute

European Union

Read Before Seminar  EU GDPR, recitals 57–73, arts 11–21, 23.

United Kingdom

Further          UK GDPR, art 15(1A)

A new qualification to the extent of search required for access requests inserted by the Data (Use and Access) Act 2025.

Further          Data Protection Act 2018, sch 2 part 1.

This contains exemptions to data rights that were set down on the basis of the EU GDPR, art 23 (Restrictions). Since Brexit there is now no need to rely on article 23 due to parliamentary sovereignty, although reference to it may be useful in relation to securing an ‘adequacy’ agreement with the European Union for international data transfers.

Case Law

European Union

Read Before Seminar  Case C-131/12 Google Spain SL and Google Inc v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González ECLI:EU:C:2014:317.

         This case saw the birth of a right that was not explicitly laid out in statute in the previous regime, the Data Protection Directive 1995. It allows individuals, under certain conditions, to have themselves ‘delisted’ from search engines. Colloquially it has become known as the ‘right to be forgotten’. In the GDPR, the ‘right to erasure’ was introduced too, although this has a more nuanced and explicit statutory structure. It is thought the two rights — the ‘right to be forgotten’ from Google Spain and the right to erasure from the GDPR — now coexist, although their exact interaction is unclear, largely because search engines get special, judge-made treatment as a special type of data controller.

Recommended                Case C-136/17 GC and Others v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:773.

This case elaborates on the special treatment search engines get and how it affects other parts of the GDPR regime — in particular, article 9 ‘special category data’, which it turns from a legal basis into what almost looks like another data right, triggered by a data subject rather than carried out ex ante.

Further          Case C-434/16 Nowak ECLI:EU:C:2017:994.

On the broad scope of the right of access.

Further          Case C-154/21 RW v Österreichische Post AG ECLI:EU:C:2023:3.

On the right to know the ‘recipients’ of personal data, and that responses must identify the specific recipients if the data subject so requests, not just generic categories. See also, in English law, Harrison v Cameron & Anor [2024] EWHC 1377 (KB).

Further          Joined Cases C-141/12 and C-372/12 YS and Others ECLI:EU:C:2014:2081.

On the distinction between data and documents.

Further          Case C-487/21 FF v Österreichische Datenschutzbehörde and CRIF ECLI:EU:C:2023:369

Clarifying YS and Others that individuals have a right to database extracts or documents if the provision of this copy is necessary to exercise rights.

Further          Case C-307/22 FT (Copies du dossier médical) ECLI:EU:C:2023:811

On the purpose-blind nature of the right of access under post-Brexit EU case law: ‘the controller is under an obligation to provide the data subject, free of charge, with a first copy of his or her personal data [..] even where the reason for that request [..]’.

United Kingdom

Recommended                NT1 & NT2 v Google LLC [2018] EWHC 799 (QB).

The first Google Spain RTBF case applied by English courts. One applicant had his delisting refusal by Google overturned by the High Court, and one had it confirmed. What was the difference between them?

Further          Dawson-Damer & Ors v Taylor Wessing LLP [2017] EWCA Civ 74 at [104]–[108] and B v The General Medical Council [2019] EWCA Civ 1497 at [79].

On ‘purpose-blind’ access rights: ‘the general position is that the rights of subject access to personal data […] are not dependent on appropriate motivation on the part of the requester’.

Further          Harrison v Cameron & Anor [2024] EWHC 1377 (KB)

Confirms essentially that the same tailored right to access specific data recipients in Case C‑154/21 RW v Österreichische Post AG ECLI:EU:C:2023:3 is also a feature of post-Brexit UK law.

European Court of Human Rights

Further          Hurbain v Belgium ECLI:CE:ECHR:2023:0704JUD005729216

This case was an Article 10 ECHR claim by the Belgian newspaper Le Soir, which had been ordered to anonymise a digital archived version of a 1994 article about a driving offence. No breach of article 10 was found.

Further          Gaskin v United Kingdom (1990) 12 EHRR 36.

Case law linking a right of access to the right to respect for private and family life (Article 8) and the right to receive information (Article 10); a breach was found in respect of the UK when documents were not provided.

Seminar task — accessing your data

For this topic, there is a short preparatory activity we will use as a basis for discussion about the right of access. Carrying out steps 1–2 is essential, but step 3 (writing further to a company to request your data) is optional. We advise doing it, however: it is free, and it is interesting to see how the company interacts with you and what you get back.

1.     Try to use a ‘download my data’ tool on one or more Internet services that you use. Some examples of those from the largest services are linked here: Facebook; X/Twitter; Instagram; Amazon; Apple; Google; Microsoft; TikTok; OpenAI.

2.     If you get a copy of the data, try to locate the privacy policy for that company. Privacy policies are usually implemented so as to provide some of the information required by Article 13 of the GDPR (have a look). Re-read articles 13–15, and article 20, of the GDPR. Is this all the data you requested? What personal data might be missing? Do you also think (e.g. from your usage of a service) that the controller might have additional data about you that they are not telling you about?

3.     [Optional, but recommended] Write an email or other message to the firm (how should be detailed in the privacy policy) asking for a full copy of your data, highlighting any omissions you discovered or suspected based on point 2. Michael has made a template for a very full version of how to do this here, but you are welcome to just type something shorter or more focussed.

If you get to steps 2 and 3, save all the files you get from this process in one place (e.g. a copy of relevant parts of the privacy policy, and any data), so we can talk about the process during the seminar. We won’t be collecting them, but you’ll be invited to share any specific findings or aspects of them you are comfortable sharing. Regardless, save and look at any data you receive. We’ll be thinking about it later in the year.

Questions to consider when doing/reflecting on this task: What is the right of access for? What is its scope? What are its limits? How powerful is the right of access at achieving its various purposes? What barriers exist to make access rights useful or powerful? What barriers did you face getting access to, or scrutinising, data? How might you reform the right of access to make it more useful — or is such reform futile?

Seminar 15 Online Tracking: Cookies and Controllers

Professor Michael Veale

 

🎼 Somebody is prying through your files, probably
Somebody's hand is in your tin of Netscape magic cookies
But relax: if you're an interesting person
Morally good in your acts
You have nothing to fear from facts

The Age of Information (Momus, 1997)

Internet law has regularly clashed with online tracking for advertising and purposes beyond. It hasn’t always gone well for the law: an industry proliferates today despite genuine questions about its legality. What are cookies? Should we be concerned about them? What kind of information is involved in online tracking, and how is it used? Understanding online tracking helps us to understand everything from power online to corporate and state surveillance, and stress-tests complex concepts in data protection law such as controllership.

Questions to prepare for the seminar

-       What conceptual roles do cookies play in online tracking and profiling?

-       Consent is a large part of the way that we govern cookies in Europe. But is it a good way to do this? What would the alternatives look like, and in what ways would they be better or worse?

-       If many of the ways cookies are used on the Internet are currently illegal, why hasn’t anything been done about it? What are the main challenges and regulatory hurdles that prevent effective enforcement?

-       What harms does excessive tracking cause? If you’re okay with getting targeted products, is the extent of tracking present online today acceptable?

Throughout, pay attention to the issue of data controllership. Who is the controller when it comes to tracking on a website? Why is it conceptually hard to work out? Where should responsibility fall?

Articles

Read Before Seminar  Michael Veale and Frederik Zuiderveen Borgesius, ‘Adtech and Real-Time Bidding under European Data Protection Law’ (2022) 23(3) German Law Journal 226.

         Argues that real-time bidding is difficult if not impossible to reconcile with data protection law.

Recommended                European Data Protection Board, ‘Opinion 08/2024 on Valid Consent in the Context of Consent or Pay Models Implemented by Large Online Platforms’ (EDPB 2024).

Recommended                Information Commissioner’s Office, ‘Consent or Pay’ (ICO, 10 July 2025).

Recommended                Catherine Armitage and others, Study on the Impact of Recent Developments in Digital Advertising on Privacy, Publishers and Advertisers: Final Report (Publications Office of the European Union 2023) pages 66-103 only.

Recommended                Lee McGuigan and others, ‘The After Party: Cynical Resignation in Adtech’s Pivot to Privacy’ (2023) 10 Big Data & Society.

Further          Michael Veale, Midas Nouwens and Cristiana Santos, ‘Impossible Asks: Can the Transparency and Consent Framework Ever Authorise Real-Time Bidding After the Belgian DPA Decision?’ (2022) 2022 Technology and Regulation 12.

Further          René Mahieu and Joris Van Hoboken, ‘Fashion-ID: Introducing a Phase-Oriented Approach to Data Protection?’ (European Law Blog, 30 September 2019).

Further          Information Commissioner’s Office, ‘Update Report into Adtech and Real Time Bidding’ (20 June 2019).

Further          Information Commissioner’s Office, ‘Data Protection and Privacy Expectations for Online Advertising Proposals’ (25 November 2021).

Statute

United Kingdom

Read Before Seminar  Privacy and Electronic Communications (EC Directive) Regulations 2003, reg 6

         This derives from the e-Privacy Directive, art 5(3), below.

European Union

Recommended                EU GDPR, arts 4(7), 26.

On controllers and joint controllers.

Recommended                Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) OJ L 201/37, art 5(3).

This is implemented in the UK by PECR reg 6, above

Cases

European Union

Read Before Seminar  Case C-49/17 Fashion ID ECLI:EU:C:2019:629.

         On whether a website is a controller in respect of a tracker on that website (a Facebook Pixel), and if so, in relation to which parts of the processing.

Recommended                Case C-210/16 Wirtschaftsakademie Schleswig-Holstein ECLI:EU:C:2018:388.

         On whether the owner of a Facebook fanpage is a joint controller with Facebook.

Recommended                Case C-604/22 IAB Europe ECLI:EU:C:2024:214.

         On the scope of personal data and consent preferences from cookie banners.

Recommended                Case C-252/21 Meta Platforms and Others ECLI:EU:C:2023:537.

Particularly concerning the interaction of web tracking data and Article 9 sensitive data characteristics.

Further          Case C-25/17 Jehovan todistajat ECLI:EU:C:2018:551.

Consider how the controllership argument might analogise.

Further          Case C-673/17 Planet49 GmbH ECLI:EU:C:2019:801.

Establishes that the Data Protection Directive 1995 had a similarly high standard for consent to that in the GDPR.

Further          Case C-131/12 Google Spain ECLI:EU:C:2014:317.

Establishes that a search engine can be a controller of personal data (of the search index, when queried by name).

 

💡 For both the compulsory cases it can be useful to (additionally) read the Advocate General opinions if you are unclear about the points and context. Remember, the Court does not always agree with these, so if you cite them when writing, use them only analytically and in light of the judgment which followed, not as binding law!

Recommended                Case C-210/16 Wirtschaftsakademie Schleswig-Holstein GmbH ECLI:EU:C:2017:796, Opinion of AG Bot.

Recommended                Case C-49/17 Fashion ID GmbH & CoKG v Verbraucherzentrale NRW eV ECLI:EU:C:2018:1039, Opinion of AG Bobek.

Spot the Star Wars reference.

United Kingdom

Read Before Seminar  RTM v Bonne Terre Ltd & Hestview Ltd [2025] EWHC 111 (KB)

This is an incredibly interesting case about how a company that knows a lot about a data subject may have a different threshold for understanding whether consent from that individual was truly freely given.

 

The cases below are not directly about the substance of tracking, but concern a workaround relating to cookies and the ability to claim damages on that basis. They are included here mainly for completeness, but are also of interest (both the final judgments listed and the prior decisions appealed) in relation to how courts understand issues of tracking.

Further          Vidal-Hall v Google Inc [2015] EWCA Civ 311.

Further          Lloyd v Google LLC [2021] UKSC 50.

Belgium

Further          (Regulator Decision) Belgian Data Protection Authority, ‘Decision on the Merits 21/2022 of 2 February 2022, Complaint Relating to Transparency & Consent Framework (IAB Europe), DOS-2019-01377’ (2 February 2022).

Summarised and analysed in Veale, Nouwens and Santos (2022) above.

Seminar 16 International Transfers of Personal Data

Professor Orla Lynskey

 

Surveillance law meets data protection law in the world of data transfers. Information moves easily, but as we’ve seen, as it crosses borders it becomes vulnerable to use and abuse by different governments. See the photos of the undersea data cables entering the US (and likely being tapped by the NSA) as photographed by artist Trevor Paglen here. Legal spotlights have been particularly bright on EU-US data transfers, following complaints by Max Schrems against Facebook in Ireland after the Snowden revelations in 2013. This strategic litigation has ended in the striking down of two international transfer agreements, and leaves other contractual mechanisms used for international transfers on shaky ground. In this session, we’ll be looking at the intersection of national security law and the Charter in light of data transfers.

Questions to prepare for the seminar

Readings

Read Before Seminar  Christopher Kuner, ‘Reality and Illusion in EU Data Transfer Regulation Post Schrems’ (2017) 18 German Law Journal 881.

Read Before Seminar  Anupam Chander and Paul M Schwartz ‘Privacy and/or Trade’ (2023) 90 University of Chicago Law Review 49.

Read Before Seminar  Theodore Christakis, ‘After Schrems II: Uncertainties on the legal basis for data transfers and constitutional implications for Europe’ (European Law Blog, 21 July 2020).

Recommended                Gloria González Fuster, ‘Un-mapping Personal Data Transfer’ (2016) 2(2) European Data Protection Law Review 160-168.

Recommended                Douwe Korff, ‘The inadequacy of the October 2022 new US Presidential Executive Order on Enhancing Safeguards For United States Signals Intelligence Activities’ (2022)

Recommended                Christopher Kuner, ‘Schrems II Re-Examined’ (Verfassungsblog, 25 Aug 2020)

Read alongside Douwe Korff, ‘Comments on Prof. Chris Kuner’s Blog Schrems II Re-Examined of 25 August 2020’ (26 August 2020) OA-ish link (alt link)

Recommended                Privacy International, ‘Secret Global Surveillance Networks’ (Privacy International, 2018).

Further          Barbara Sandfuchs, ‘The Future of Data Transfers to Third Countries in Light of the CJEU’s Judgment C-311/18 – Schrems II’ (2021) GRUR Int ikaa204. UCL link

Further          Andrew D Murray, ‘Data Transfers between the EU and UK Post Brexit?’ (2017) 7 International Data Privacy Law 149 OA link

Further          Graham Greenleaf, ‘Japan: EU Adequacy Discounted’ (2018) 155 Privacy Laws & Business International Report 8-10.

Further          Svetlana Yakovleva, ‘Personal Data Transfers in International Trade and EU Law: A Tale of Two Necessities’ (2020) 21 Journal of World Investment & Trade 881

Further          European Data Protection Board, ‘Recommendations 01/2020 on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data, Version 2’ (18 June 2021).

Read the account of the saga and related blog posts by Max Schrems’ NGO at noyb.eu here.

 

 

Statute

European Union

Read Before Seminar  EU GDPR, ch V.

Recommended                Commission Implementing Decision (EU) 2021/1773 of 28 June 2021 pursuant to Directive (EU) 2016/680 of the European Parliament and of the Council on the adequate protection of personal data by the United Kingdom (notified under document C(2021) 4801) OJ L360/69.

Draft renewal of the decision available here.

United Kingdom

Further          Data Protection Act 2018, ss 17–18.

Further          UK GDPR, ch V.

Cases

European Union

Read Before Seminar  Case C-311/18 Data Protection Commissioner v Facebook Ireland and Schrems ECLI:EU:C:2020:559 (“Schrems II”).

Recommended                Case C-362/14 Maximillian Schrems v Data Protection Commissioner ECLI:EU:C:2015:650 (“Schrems I”).

It might be useful to read the AG Opinions in both cases too. These are listed below.

Further          Case C‑362/14 Maximillian Schrems v Data Protection Commissioner ECLI:EU:C:2015:627, Opinion of Advocate General Bot.

Further          Case C-311/18 Data Protection Commissioner v Facebook Ireland and Schrems ECLI:EU:C:2019:1145, Opinion of Advocate General Saugmandsgaard Øe.

Recommended                Case T-553/23 Latombe v Commission ECLI:EU:T:2025:831.

This is a General Court case — it is on appeal to the Court of Justice as C-703/25 P.

Ireland

Further          The Data Protection Commissioner v Facebook Ireland Ltd & Anor [2017] IEHC 545 (Ireland)

This case is very useful in restating the facts of data transfers in the context of Facebook and US surveillance under FISA 702 and EO 12333. It is the case which led to the questions being referred to the CJEU in Schrems II.

 

Seminar 17 Emerging Models for Data Governance

Dr Tommaso Fia

This week’s lecture concerns how data is or may be governed according to different models. Broadly speaking, governance models may be market-based or grounded in different normative rationales that stress a more collective dimension of data processing (as shown by Viljoen). Naturally, different models exhibit different features based on how they are implemented and on the problems they are meant to address. Among other things, the ‘novel, European way of data governance’ (Data Governance Act, recital 32) revolves around data intermediation services (as shown by Carovano and Finck). Yet, other governance models have emerged as a result of bottom-up initiatives (as discussed by Lopez Solano and others and Micheli and others), as well as within urban locales (Bass and Old). A key challenge for these models is that they might find it difficult to scale beyond local or experimental settings.

Questions to prepare for seminar

-               What is ‘data governance’?

-               What is the ‘novel, European way of data governance’ (Data Governance Act, recital 32)?

-               What is a ‘data intermediation service’ under the Data Governance Act?

-               What obligations are data intermediation service providers subject to?

-               What do ‘services of data cooperatives’ cover under the Data Governance Act?

-               What are the main models for ‘alternative’ data governance discussed by Lopez Solano and others? What makes them different from the dominant, market-oriented approaches to data governance?

-               What is a data commons? Why is it difficult to imagine that such a governance model will scale up?

-               Has the data commons established in Barcelona in the context of the DECODE Project brought real change? What are its strengths and weaknesses?

Readings

Read Before Seminar  Salomé Viljoen, ‘A Relational Theory of Data Governance’ (2021) 131 Yale Law Journal 573.

Read Before Seminar  Joan Lopez Solano and others, ‘Governing Data and Artificial Intelligence for All: Models for Sustainable and Just Data Governance’ (European Parliament 2022) (only pp 6-47).

Read Before Seminar  Gabriele Carovano and Michèle Finck, ‘Regulating Data Intermediaries: The Impact of the Data Governance Act on the EU’s Data Economy’ (2023) 50 Computer Law and Security Review 1.

Recommended                Marina Micheli and others, ‘Emerging Models of Data Governance in the Age of Datafication’ (2020) 7 Big Data and Society 1.

Recommended                Charlotte Ducuing, Gijs van Maanen and Tommaso Fia, ‘Data Commons’ (2024) 13 Internet Policy Review 1.

Recommended                Tommaso Fia and Gijs van Maanen, ‘Through Thick and Thin: Data Commons, Community and the Struggle for Collective Data Governance’ [2025] Technology and Regulation 114.

Recommended                Fernando Fernandez-Monge and others, ‘Reclaiming data for improved city governance: Barcelona’s New Data Deal’ (2023) 61 Urban Studies 1291. UCL typeset link

Further          Linnet Taylor and others, ‘Governing Artificial Intelligence Means Governing Data: (Re)Setting the Agenda for Data Justice’ [2025] Dialogues on Digital Society 1.

Further          Tommaso Fia, ‘An Alternative to Data Ownership: Managing Access to Non-Personal Data through the Commons’ (2021) 21 Global Jurist 181.

Further          Theo Bass and Rosalyn Old, ‘Common Knowledge: Citizen-led Data Governance For Better Cities’ (Decode Project, 2020).

 

Statute

European Union

Read Before Seminar  Regulation (EU) 2022/868 of the European Parliament and of the Council of 30 May 2022 on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act) OJ L, ch III and recitals 27-47


 

Seminar 18 Automated Decision-Making

Professor Michael Veale

In this session, we're going to look at automated decision-making systems and how they're regulated. The regulation of these systems is far from new: it draws on French law from the late 1970s and has been transmitted onwards through a wide variety of regimes and in different forms up to the present day.

More recently, as countries have updated data-protection and privacy laws, this framework, which has its roots in French administrative law, has spiralled into national statute books across many countries. These laws broadly regulate solely automated, significant decision-making. They have a particular structure, and that structure does not always fit neatly or play nicely with the real world. We'll examine the legal structure and some of the major war stories that have defined the field of algorithmic decision-making.

More recently, case law has started to emerge on these provisions. Despite being decades old, they were almost never litigated until fairly recently, when the issues they concern became more relevant to almost all of us. However, are these provisions really what we need to secure a just digital future? What kinds of rights and remedies are needed in a world of runaway Kafka-esque decision-making?

Questions to prepare for the seminar

-               What kind of automated decision systems exist on the internet? Have you encountered any? Would they meet the Article 22 conditions?

-               What are some of the structural issues with Article 22? How might a court go about resolving them?

-               What kind of “right to an explanation” exists in data protection law? Is it useful? If so, under what conditions?

Readings

Read Before Seminar  Reuben Binns and Michael Veale, ‘Is That Your Final Decision? Multi-Stage Profiling, Selective Effects, and Article 22 of the GDPR’ (2021) 11(4) International Data Privacy Law 319.

         This piece examines algorithmic systems in real contexts and argues ADM laws do not apply well to them.

Recommended                Margot E Kaminski and Jennifer M Urban, ‘The Right to Contest AI’ (2021) 121 Columbia Law Review 1957.

         This article looks at what it would be to contest an AI system, and how different laws are set up.

Recommended                Article 29 Working Party, ‘Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (WP251rev.01)’ (6 February 2018).

These are the guidelines on the automated decision-making provisions by the Article 29 Working Party, which was the name for the loose group of regulators (established under Article 29 of the Data Protection Directive 1995). The GDPR formalised this group and gave it legal personality, calling it the European Data Protection Board. During the period between the GDPR passing (2016) and becoming enforceable (2018), the Article 29 Working Party created guidelines for the new law. These were given the stamp of approval by the EDPB so became recognised as guidance from the EDPB. They do not bind regulators or courts, but they are influential in enforcement strategies and as a form of persuasive ‘soft law’. Note that these guidelines came before the CJEU clarifications of Article 22 and related provisions in the cases below.

Further          Rebecca Crootof, Margot E Kaminski and W Nicholson Price II, ‘Humans in the Loop’ (2023) 76 Vanderbilt Law Review 429.

This is a useful piece critiquing the ‘human in the loop’ narrative around many automated decision laws.

Statute

European Union

Read Before Seminar  EU GDPR, recital 71, arts 13(2)(f), 14(2)(g), 15(1)(h), art 22.

         These are the core automated decision-making provisions in the GDPR.

Read Before Seminar  Directive (EU) 2024/2831 of the European Parliament and of the Council of 23 October 2024 on improving working conditions in platform work OJ L, arts 9–11.

         This is a new EU data protection law that applies only to “persons performing platform work” for “digital labour platforms” — such as Uber, Deliveroo, etc. We won’t focus on the material scope in terms of which workers exactly are covered — it gets quite complex — but the EU has effectively passed a new version of GDPR, art 22 for certain kinds of decisions, with some changes to the scope and obligations of the right. What do you think?

Further          Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act), art 86.

A residual right to algorithmic explanation for individuals affected by certain ‘high-risk systems’ governed by the EU AI Act, applicable only to the extent that other EU law does not provide such a right.

United Kingdom

Read Before Seminar  UK GDPR, arts 22A–22D.

         These are changes that were brought in with the Data (Use and Access) Act 2025.

Case law

European Union

Read Before Seminar  Case C-634/21 SCHUFA Holding (Scoring) ECLI:EU:C:2023:957

This case deals with some of the issues that were argued in Binns & Veale (2021).

Read Before Seminar  Case C-203/22 Dun & Bradstreet Austria ECLI:EU:C:2025:117.

         This is a major case from the CJEU around algorithmic explanations.

Further          Joined Cases C‑511/18, C-512/18 and C‑520/18 La Quadrature du Net and Others ECLI:EU:C:2020:791 [181]–[182].

This is a complex case that relates to data retention — effectively a state telling a telecoms firm to hold onto the detailed data it has about users so that it can be used later by state surveillance entities if needed. Of legal interest in relation to automated decision-making, however, is the way the CJEU reads certain language restricting solely automated, ‘decisive[]’ measures not from the GDPR (which was not the law at issue in this challenge) but directly from the Charter of Fundamental Rights, effectively reading a form of the automated decision-making provisions out of the heart of EU law.

Seminar 19 Language Models & Personal Data

Professor Orla Lynskey & Professor Michael Veale

Questions for reading

-       Under what conditions would an AI model qualify as personal data under the GDPR? What problems does this create?

-       Is it legal to scrape personal data from the Web under data protection law? What are the main aspects that make authorisation difficult, and what are the main tensions this creates in relation to the training of large language models?

-       Do you think data protection law as it stands is a suitable legal mechanism for governing artificial intelligence? If not, why not?

Readings

Read Before Seminar  European Data Protection Board, ‘Opinion 28/2024 on Certain Data Protection Aspects Related to the Processing of Personal Data in the Context of AI Models’ (EDPB, 17 December 2024).

         This is a result of difficult questions posed by the Irish DPA under Art 64(2) to the EDPB, a mechanism which puts the EDPB on the clock to make guidance on the issues.

Read Before Seminar  Reuben Binns and Lilian Edwards, ‘Reputation Management in the ChatGPT Era’, Oxford Handbook on the Foundations and Regulation of Generative AI (Oxford University Press 2025) section 3.2.

Recommended                Michal Gal and Orla Lynskey, ‘Synthetic Data: Legal Implications of the Data-Generation Revolution’ (2023) 109 Iowa Law Review 1087.

Recommended                Michael Veale, Reuben Binns and Lilian Edwards, ‘Algorithms That Remember: Model Inversion Attacks and Data Protection Law’ (2018) 376 Phil. Trans. R. Soc. A 20180083.

Further          Information Commissioner’s Office, ‘Information Commissioner’s Office response to the consultation series on generative AI’ (ICO, 10 December 2024).

This is to be read with the five calls for consultation it collectively responds to (on lawful basis, purpose limitation, accuracy, individual rights, and the allocation of controllership), each of which contains further analysis (with colourful diagrams) and has yet to be consolidated.

Further          Jason Koebler, ‘Not Just “David Mayer”: ChatGPT Breaks When Asked About Two Law Professors’ (404 Media, 2 December 2024).

Further          The Hamburg Commissioner for Data Protection and Freedom of Information, ‘Discussion Paper: Large Language Models and Personal Data’ (Datenschutz Hamburg, 15 July 2024).

This paper stirred a lot of discussion in mid-2024 for coming down extremely strongly against language models being classified as personal data under any circumstances — the ICO and EDPB papers above disagree with it (and the EDPB paper, adopted under Art 64(2) GDPR, will likely be very influential in future cross-border opinions and cases, as well as before the CJEU, although it does not formally bind Member State authorities).

Cases

European Union

Read Before Seminar  Case C-136/17 GC and Others v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:773.

         In this case, the CJEU interprets what appears to be an unambiguous, strict obligation in the GDPR to be a little less than that, given the ‘responsibilities, powers and capabilities’ of the controller. Will the same thing happen with generative AI?


 


 

Seminar 20 Can Law Change The Internet?

Professor Orla Lynskey

 

Far from being the unregulated Wild West that commentators would sometimes have us believe, the Internet – and social activity on the Internet – is now often highly regulated. As the digital rulebook expands further, overlaps and sometimes clashes between legal frameworks become increasingly likely. In this final seminar, we will examine a recent example of the interaction between the intermediary liability rules (studied in Term 1) and the data protection rules (examined in Term 2).

 

Close examination of the Russmedia Digital SRL judgment will allow us to grapple with some of the most challenging doctrinal aspects of data protection law, as well as to reflect on some of the specificities of Internet regulation and the challenges on the horizon.

 

To prepare for the seminar

The first part of this seminar will be organised as a Moot Court. Students will be divided into three groups: Russmedia Digital SRL, X (the victim of the advertisement) and the European Commission (intervening in the proceedings). Based on the facts of the Russmedia case (as set out in the judgment, paras 30 to 42, and the Opinion of the AG, paras 19 to 35), you are asked to consider three key questions for the purposes of the proceedings:

 

1.     Is Russmedia Digital SRL a controller or a joint controller for the purposes of the GDPR?

2.     Should it be found that Russmedia Digital SRL is a data controller for GDPR purposes, what are its primary obligations pursuant to the GDPR and how do these obligations apply to this factual scenario? Can Russmedia Digital SRL limit its obligations under data protection law in any way?

3.     Can Russmedia Digital SRL avail of the intermediary liability exemptions found in the E-Commerce Directive?

 

Each group is asked to prepare a statement of claim (2 A4 pages), setting out its key arguments in relation to each of these questions and to be prepared to present these arguments (10 minutes in total) before the Court. The statement of claim should be submitted to the Court via Moodle by 6pm on Monday 23 March 2026.

 

Following this Moot, we will have an open discussion of some of the key issues raised by the judgment. Consider in particular:

1.     Whether it is possible for an entity simultaneously to be sufficiently ‘neutral’ to benefit from an intermediary liability exemption under the E-Commerce Directive and to exercise sufficient control over personal data processing under the GDPR to be a data controller.

2.     What the practical implications of this judgment may be. What will it mean for websites that host advertising?

Readings

Read Before Seminar  Martin Husovec, Principles of the Digital Services Act (OUP, 2024), pp.401-407 (section 18.4; Chapter 17 if desired).

On the interplay between the Digital Services Act and the GDPR.

Recommended                Orla Lynskey, ‘Complete and Effective Data Protection’ (2023) 76 Current Legal Problems 297

For a discussion of controllership and the potential and challenges of the responsibilities doctrine.

Recommended                Michèle Finck, ‘Cobwebs of Control: The Two Imaginations of the Data Controller in EU Law’ (2021) 11 International Data Privacy Law 333.

On the challenges of an expansive interpretation of controllership. 

Recommended                Frank H. Easterbrook, ‘Cyberspace and the Law of the Horse’ [1996] University of Chicago Legal Forum 207.

For you to consider its continuing relevance as a critique in light of Russmedia!

Cases

European Union

Read Before Seminar  Case C-492/23, X v Russmedia Digital SRL and Inform Media Press SRL ECLI:EU:C:2025:935

Read Before Seminar  Case C-492/23, X v Russmedia Digital SRL and Inform Media Press, ECLI:EU:C:2025:68 (Opinion of Advocate General Szpunar).

Read Before Seminar  Case C-136/17 GC and Others v Commission nationale de l’informatique et des libertés (CNIL) ECLI:EU:C:2019:773.

In this case, the Court confirms the ‘responsibilities doctrine’. We might wonder what the status of this doctrine is following Russmedia Digital SRL.

Recommended                Joined Cases C-682/18 and C-683/18, Peterson v Google LLC ECLI:EU:C:2021:503

In this case, the Court finds that the operator of an online platform does not itself ‘communicate to the public’ for copyright purposes merely by making available illegal content uploaded by users, and considers the relationship between the Copyright Directive and the E-Commerce rules.

Recommended                Case C-184/20, OT ECLI:EU:C:2022:601 (Opinion of AG Pikamäe)

On when an inference is deemed sensitive under data protection law (requiring additional legal justification).

Recommended                Case C-49/17 Fashion ID GmbH & CoKG v Verbraucherzentrale NRW eV ECLI:EU:C:2018:1039 (Opinion of AG Bobek).

In this Opinion, the Advocate General explores some of the challenges of a broad interpretation of data controllership.