
The YTL Centre usually holds three 'Law and Justice' forums per year. These are one-day events that discuss major questions of the day by bringing to bear interdisciplinary perspectives from politics, philosophy and law.

The aim of the Forum on 'Technology and Manipulation' is to explore how recent technologies can be used (and have been used) to influence our behaviour in ways that bypass our capacity to deliberate as autonomous agents. This is the case both in relation to exercises of individual autonomous agency (for example, when particular software tracks our behaviour and directs us to make certain purchases or adopt a certain lifestyle) and in relation to exercises of collective agency (a recent example is the Russian interference in a number of elections via the use of bots and fake social media accounts). To explore this issue, this forum will bring together legal and political philosophers as well as experts in computer science and communication.

The forum will be organized around three themes:

  • Communication and Free Speech
  • The Self
  • The Global Order

This event is funded by the YTL Centre with the support of the Society for Applied Philosophy.



11:00 – 11:15     Welcome and Introduction

11:15 – 13:30     Communication and Free Speech

Siva Vaidhyanathan (University of Virginia), “The Cacophony: Rethinking Free Speech”

Dr. James Williams (Oxford University), “The Grammar of Influence”

Onora O'Neill (Cambridge), “Communication and Democracy in a Digital Age”

13:30 – 14:30     Lunch

14:30 – 16:00     The Self

Brett Frischmann (Villanova University), “Reverse Turing Tests for Robotic Human Beings”

Tom Douglas (Oxford), “Technological Manipulation and the Right to Mental Integrity”

16:00 – 16:30     Coffee/Tea

16:30 – 18:30     The Global Order

Samantha Bradshaw (Oxford / Canadian International Council), “The Global Organization of Social Media Manipulation”

Scott Shapiro (Yale/UCL), “Can Hacking be an Act of War?”



Dr. Samantha Bradshaw (Oxford University / Canadian International Council)

“The Global Organization of Social Media Manipulation”

Social media has become a proxy for political power. Around the world, governments and political parties have exploited the algorithms and advertising infrastructure of social media platforms to spread disinformation and target users with propaganda. Drawing on three years of data collection about the strategies and resources states use to manufacture consensus and automate suppression on social media, this paper examines the global trends of organized disinformation campaigns. It will highlight the emerging tools and techniques of social media manipulation, and comparatively assess the capacity of different states to carry out these activities.


Dr. Tom Douglas (Oxford University)

“Technological Manipulation and the Right to Mental Integrity”

Medical law and conventional medical ethics protect us against the technological manipulation of our bodies, in part through recognising and enforcing a right to bodily integrity—understood here as a right against interference with the body. In this talk, I will explore the possibility that we might protect ourselves against the technological manipulation of our minds through recognising an analogous right to mental integrity—understood as a right against interference with the mind. In the first half of the talk, I will present the case in favour of recognising such a right. In the second half, I will raise two problems faced by proponents of the right. I'll end by suggesting that we face a trilemma: the right to mental integrity is either (a) philosophically unsupported, (b) too weak to provide our minds with the protection we seek, or (c) so broad and strong as to render impermissible many widespread and seemingly innocuous forms of behavioural influence.


Professor Brett Frischmann (Villanova University)

“Reverse Turing Tests for Robotic Human Beings”

As the scale and scope of techno-social engineering of humans grows more pervasive and intimate through the ubiquitous embedding of networked sensors in public and private spaces, our devices, our clothing, and ourselves, it’s become an urgent ethical and political matter to determine if the technologies that lead people to behave robotically are dehumanizing. This question motivated me to develop a new form of inquiry: reverse Turing tests. I first will provide a few illustrative examples of modern techno-social engineering. Next, I will examine the roots of modern techno-social engineering in Taylorism, Skinnerian behaviorism, behavioral psychology and economics, and studies of human-computer interactions. Finally, I will explain how reverse Turing tests would work and illustrate with a few examples.


Baroness Onora O'Neill (Cambridge University)

“Communication and Democracy in a Digital Age”

Ethical and epistemic standards for communication have been discussed since antiquity. And since antiquity they have periodically been disrupted by technological innovations, then revised and reinforced by cultural and latterly by legal and regulatory measures. However, the transformations produced by the mushrooming growth of digital technologies in the late C20, which has coincided with growing globalisation and the declining regulatory capacities of states, may prove particularly challenging. These technologies were initially seen as extending possibilities for communication in ways that would support democracy and wider civic participation. The promise has not been sustained. I shall comment on some reasons for this disappointing result, including the dominant position of ‘freedom of expression’ in contemporary discussion of norms that bear on speech acts; the ease with which anonymous speech can bypass normative standards; and the difficulty of identifying legally or institutionally robust ways of securing or enforcing standards for digitally transmitted content. Cultural, legal and regulatory approaches to these problems are all likely to prove difficult and controversial.


Professor Scott Shapiro (Yale / UCL)

“Can Hacking be an Act of War?”

Lawyers generally agree that computer hacking can count as a casus belli if these intrusions have destructive kinetic effects. If a hack causes a building to blow up, or people to die, the cyber attack may violate the UN Charter’s prohibition on “the threat or use of force.” It might even count as an “armed attack” justifying self-defense in response. In this talk, I will ask whether cyber-attacks that do not have destructive kinetic effects, but manipulate and distort the political process of a country, can count as an act of war. I will argue that the answer is yes and will explore the relationship between war, force and fraud in the cyber-age.


Professor Siva Vaidhyanathan (University of Virginia)

“The Cacophony: Rethinking Free Speech”

Whether considering social media moderation, news regulation, or university policies and practices, discussions and debates about speech invariably focus on the quantity of speech rather than the quality of speech. Overdetermined by the United States’ Constitutional principles embodied in the First Amendment, these debates frame conflicts in eighteenth-century terms. This talk urges an update and revision of the framework around these conflicts, one that recognizes that increasing the amount of speech is a problem solved long ago. Today the challenge is filtering and discriminating among speakers and subjects so that citizens may govern themselves and make good decisions about their lives.


Dr. James Williams (Oxford University)

“The Grammar of Influence”

Any analysis of technological influence, whether descriptive or normative, depends on the ability to make clear distinctions between the diverse forms of influence generally. At present, however, there exists no vocabulary sufficient to afford this clarity. The language of influence remains a cloudy, fragmented landscape full of imprecise notions and ill-considered, domain-specific jargon. Against this background of verbal confusion, important work that aims to understand and guide novel forms of technological influence cannot usefully proceed. What is urgently needed, therefore, is a push to clarify and defragment the language of influence. In this talk I will discuss the development of such an effort: a ‘grammar of influence,’ i.e. a logic for typologizing the varieties of influence into one common, coherent construct. I will discuss the opportunities and limitations of such an approach, ways in which it might be operationalized, and why it may represent a form of cultural stewardship that is particularly important in the digital era.


Attendance for the entire event is not compulsory.

This event is open to the public and everyone is welcome to attend, though registration is required.

Seats are allocated on a strictly first-come, first-served basis.

If you find you can no longer attend, please cancel your ticket registration so that someone else can have your place.

Society for Applied Philosophy

Event details

BH Lecture Theatre 2 & BH (S) 4.03
Bush House
Strand campus, 30 Aldwych, London, WC2B 4BG