Interest in the ethical implications of AI has exploded in the past few years, and various codes and commitments have been established across academia, industry, and policy. These all emphasise similar things: that AI should be used for the benefit of humanity, must respect widely held values such as privacy, justice, and autonomy, and must be made interpretable for humans. While agreeing on these principles is valuable, it's still far from clear how to implement them in practice.

One challenge is that widely agreed principles come into conflict in concrete cases, and it's not clear how to resolve these tensions. How much privacy should we be willing to sacrifice in developing life-saving technologies, for example? How do we get the benefits of data-driven personalisation without threatening important societal values like solidarity? We'll discuss some problems with how the ethical issues surrounding AI are often talked about, and highlight some key dilemmas we need to face in turning principles into practice.

A second challenge is that many of the goals of AI ethics are vastly underspecified. Here we'll focus particularly on interpretability, widely considered crucial for ensuring ethical real-world deployment of intelligent systems. Part of the reason interpretability is deemed so important is that it helps us ensure that other goals are met in the development and use of AI systems: that they are safe, reliable, and fair. The volume of research on interpretability is rapidly growing, but there is still little consensus on what interpretability is, how to measure and evaluate it, and how to control it. There is an urgent need for these issues to be rigorously addressed. We'll shed light on these questions about interpretability in the context of state-of-the-art machine learning algorithms.
Join us for a fascinating discussion at this AI SIG byte-size event at the Amazon Cambridge Development Center, followed by networking and refreshments.
Photo Credits: 6eo tech
License Details: https://creativecommons.org/licenses/by/2.0/
Photo by JESHOOTS.com from Pexels.