Legal explanations of artificial intelligence (AI) are shaping its future. The General Data Protection Regulation includes a 'right to explanation' that secures a limited opening of the algorithmic 'black box', potentially enabling data subjects to challenge automated decisions. But data protection is far from the only field in which automated decision systems, their developers and implementers have duties to explain themselves. The law, for example, variously requires explanations as to whether applications of these technologies may have played a part in discriminatory treatment, abuses of market power, negligent material harms, or excessive state or private surveillance.
What remains unclear is whether the explanations currently being generated are sufficient to maintain justice and the rule of law in the context of rapid social and economic transformation. This event considers whether now might be a good time to re-engage UK policymakers on the benefits of involving relevant civil society representatives and legal practitioners in a more constructive way. It invites figures currently leading contestation actions across key sectors (criminal justice, health, education, environmental regulation and constitutional affairs) to discuss with interested researchers and policymakers their experience of using the law to obtain explanations of data-driven decision-making, and to reflect on shared policy priorities and how best to pursue them.
This event is co-organised by BIICL and The Dickson Poon School of Law, King's College London.
Event convened by Dr Irene Pietropaoli (BIICL)