
Theorizing Platform Content Moderation: Power, Resistance, and Democratic Control
João C. Magalhães (University of Groningen); Naomi Appelman (University of Amsterdam)
Humanities Bridgeford Street Building: Room G.35
Platform content moderation – how private digital intermediaries define and control what is objectionable and desirable – has emerged as a central contemporary form of mass speech governance, able to influence billions of people globally. Much of the growing scholarship focuses on describing its functioning, complexities, and technologies, and on trying to reform platforms by holding them to constitutional values. Yet, despite its obvious political nature, content moderation remains under-conceptualized as a (global) political practice.
This is puzzling, as moderation rearticulates key concepts of political theory. Recent years have made clear that platforms, regardless of their unilateral ability to moderate, often seek to appease some actors in the design and enforcement of their moderation rules and technologies. These processes are hardly linear, though: not all voices, from all countries, at all times, are heard. These dynamics have been shown to reinforce systems of social and global oppression such as racism, sexism, or neo-colonialism in a way that is intimately connected to these companies’ global political economic interests. This evidences the need to understand how moderation relates to representation, recognition, and plurality, which are closely related to matters of justice, equality, and dignity. Similarly, it calls for understanding resistance to these systems as well as the patterns of in- and exclusion.
Two factors make these aspects challenging to understand or address through usual normative frameworks, such as legal rights. Firstly, platforms are a peculiar kind of organization: globally operating corporations with an oversized influence over the universal moral needs of socialization and individual expression. In other words, while their immense power is not anchored in usual processes of political legitimation (e.g., elections) or even a polity, and often remains legally protected by so-called ‘safe harbour’ laws, these companies still owe us something – but what, exactly, and how do we define this ‘us’? Secondly, much of content moderation today is automated through machine learning systems. The meaning of “objectionable” or “desirable”, or how to punish those who violate these definitions, may thus emerge not from direct human reasoning but from probabilistic calculations driven by complexly constructed datasets. Whose voice is represented and silenced when thousands of data annotators, moderators, officers, and technologists play some role in the construction of the algorithms that spot, say, hate speech? And how can we account for the cascading layers of rules, institutions, and actors?
Workshop aims:
This workshop aims to address the urgent task of theorizing platform content moderation. We especially welcome scholars working from the perspective of radical democratic theory, democratic resistance, decolonial theory, and political economy to consider three broad questions:
- How should we conceptualize content moderation as a form of power, and in which ways does it differ from previous forms of speech control?
- What does proper resistance to moderation mean, and how can it tackle the multiple dynamics of in- and exclusion? And
- To what extent and how should democratic control over content moderation be organised?
Monday 11th September: Content moderation and the political: Key questions

11:00-12:30   Registration
12:30-13:30   Lunch
13:30-14:00   Welcome Speech
14:00-16:00   Session 1
              Julian Morgan (Humboldt-University of Berlin): Federalising Content Moderation
              Naomi Appelman (University of Amsterdam): Resisting algorithmic content moderation
16:00-16:30   Tea and Coffee Break (optional)
16:30-17:30   Session 1 (continued)
              Stefan Luca (University of Glasgow): Tethered to their own web: Reclaiming networked governance from digital platforms
17:45-19:00   Wine Reception
19:30         Conference Dinner
Tuesday 12th September: The diverse geographies and domains of content moderation

9:45-11:30    Session 2
              Yohannes Eneyew Ayalew (Monash University) [online]: Questioning (under)representation in platform decision making: Facebook’s language blind spots during content moderation in Ethiopia
              Anupriya Dhonchak (Oxford University): A postcolonial approach to content moderation via publicity rights
11:30-12:00   Tea and Coffee Break (optional)
12:00-13:00   Session 2 (continued)
              Admire Mare (University of Johannesburg) [online]: Platform content moderation in the Global South: decolonial and social justice agenda
13:00-14:00   Lunch
14:00-16:00   Session 3
              Zhen Ye (Erasmus University Rotterdam), Tonny Krijnen (Erasmus University Rotterdam) & Qian Huang (University of Groningen): Douyin’s playful platform governance: Platform’s self-regulation and content creators’ participatory surveillance
              Marcelo Thompson (University of Hong Kong): The normative duty of care: Bridging rule of law imaginaries in platform governance
16:00-16:30   Tea and Coffee Break (optional)
16:30-17:30   Session 3 (continued)
              Alessia Zornetta (University of California, Los Angeles; McGill University): Clearing the haze: How the DSA is forcing platforms to open up about content moderation
Wednesday 13th September: The multiple actors of content moderation

9:30-11:30    Session 4
              Bhanuraj Kashyap (Macquarie University) [online]: Social media platforms and moral deskilling: How can we deliberate together in the digital age?
              João C. Magalhães (University of Groningen) & Holly Avella (Rutgers University): Moderating (through) emotions: Techs of mood-eration and the shifting foundations of speech control
11:30-12:00   Tea and Coffee Break (optional)
12:00-13:00   Session 4 (continued)
              Lucas Henrique Muniz da Conceição (Bocconi University): Is it a constitutional reflector? An assessment of digital constitutionalism in Meta’s Oversight Board
13:00-14:00   Lunch
14:00-16:00   Session 5
              Roxana Radu (University of Oxford) [online]: New forms of hybrid governance: On the social order of digital platforms
              Paul Gowder (Northwestern University): The Networked Leviathan
16:00         End of the Workshop, Final Remarks