Episode #36 – Wachter on Algorithms, Explanations and the GDPR

 
In this episode I talk to Sandra Wachter about the right to explanation for algorithmic decision-making under the GDPR. Sandra is a lawyer and Research Fellow in Data Ethics and Algorithms at the Oxford Internet Institute. She is also a Research Fellow at the Alan Turing Institute in London. Sandra’s research focuses on the legal and ethical implications of Big Data, AI, and robotics, as well as governmental surveillance, predictive policing, and human rights online. Her current work deals with the ethical design of algorithms, including the development of standards and methods to ensure fairness, accountability, transparency, interpretability, and group privacy in complex algorithmic systems.

You can download the episode here or listen below. You can also subscribe on iTunes and Stitcher (the RSS feed is here).

Show Notes

  • 0:00 – Introduction
  • 2:05 – The rise of algorithmic/automated decision-making
  • 3:40 – Why are algorithmic decisions so opaque? Why is this such a concern?
  • 5:25 – What are the benefits of algorithmic decisions?
  • 7:43 – Why might we want a ‘right to explanation’ of algorithmic decisions?
  • 11:05 – Explaining specific decisions vs. explaining decision-making systems
  • 15:48 – Introducing the GDPR – What is it and why does it matter?
  • 19:29 – Is there a right to explanation embedded in Article 22 of the GDPR?
  • 23:30 – The limitations of Article 22
  • 27:40 – When do algorithmic decisions have ‘significant effects’?
  • 29:30 – Is there a right to explanation in Articles 13 and 14 of the GDPR (the ‘notification duties’ provisions)?
  • 33:33 – Is there a right to explanation in Article 15 (the access right provision)?
  • 37:45 – Is there any hope that a right to explanation might be interpreted into the GDPR?
  • 43:04 – How could we explain algorithmic decisions? Introducing counterfactual explanations (see the sketch after this list)
  • 47:55 – Clarifying the concept of a counterfactual explanation
  • 51:00 – Criticisms and limitations of counterfactual explanations
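
To make the counterfactual idea from the 43:04 segment concrete, here is a minimal sketch in Python. Everything in it is hypothetical and invented for illustration (the toy loan-scoring rule, its weights, and the feature names are not taken from the episode or from Wachter's papers): it shows how a system could answer a rejected applicant with "had your income been X higher, the loan would have been approved" without exposing the model's internals.

    # Minimal, hypothetical sketch of a counterfactual explanation for one decision.
    # The scoring rule and its weights are invented for illustration only.

    def loan_model(income_k, debt_k):
        """Toy scoring rule: approve when the score is non-negative."""
        score = 0.08 * income_k - 0.15 * debt_k - 2.0
        return score >= 0

    def counterfactual_income(income_k, debt_k, step=0.5, max_extra=200.0):
        """Find the smallest income increase (in increments of `step`) that flips a rejection.

        This mirrors the core idea of a counterfactual explanation: report the
        nearest change to the input that changes the outcome, rather than the
        model's internal logic.
        """
        if loan_model(income_k, debt_k):
            return 0.0  # already approved; nothing needs to change
        extra = step
        while extra <= max_extra:
            if loan_model(income_k + extra, debt_k):
                return extra
            extra += step
        return None  # no counterfactual found within the search range

    if __name__ == "__main__":
        income, debt = 30.0, 10.0  # in thousands; rejected by the toy model
        delta = counterfactual_income(income, debt)
        if delta is not None:
            print(f"Loan denied. Had your income been {delta:.1f}k higher, "
                  f"it would have been approved.")

Real counterfactual methods search over all features for the closest change under some distance metric; the single-feature grid search above is only meant to show the shape of the explanation that comes out.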

Relevant Links

