
IGF 2018 WS #421 Algorithmic transparency and the right to explanation
Format: Break-out Group Discussions - 60 Min
Theme: Cybersecurity, Trust and Privacy
Subtheme: ALGORITHMS
Organizer 1: Alex Comninos, Association for Progressive Communications
Organizer 2: Deborah Brown, Association for Progressive Communications
Speaker 1: Jelena Jovanovic, Technical Community, Eastern European Group
Speaker 2: Vidushi Marda, Civil Society, Asia-Pacific Group
Speaker 3: Alex Comninos, Civil Society, African Group
Session Content:
1. Introduction to the issues by the speakers (25 minutes): In the first twenty minutes (five minutes per speaker), the speakers will introduce the problems that automated or algorithmic decision-making poses from a human rights perspective. Algorithmic justice, algorithmic bias, and algorithmic transparency will be introduced as concepts, and the technical, legal, and human rights issues will be posed.
2. Break-out group discussion I (25 minutes): Groups will ask how algorithms affect their lives and identify problems that algorithms could cause for them (15 minutes). Groups will report back (10 minutes).
3. Break-out group discussion II (25 minutes): Groups will discuss technical and policy solutions for ensuring that algorithms can provide a right to explanation (15 minutes). Groups will report back (10 minutes).
4. Panel discussion of the groups' responses (10 minutes): The speakers will respond to the report-backs and issues raised by the groups.
5. Questions from the audience to the panelists (15 minutes).
Interventions:
- Alex Comninos and Deborah Brown (APC) will be moderators.
- Jelena Jovanovic (cybersecurity professional) will provide an overview of the concepts of algorithmic transparency, algorithmic justice, and algorithmic bias, with real-life examples of the effects of algorithms from an information security perspective.
- Vidushi Marda (Article 19) will provide an overview of the human rights aspects of automated decision-making, focusing on Article 22 of the GDPR and the EU guidelines on automated decision-making. She will provide a policy and human rights perspective.
Diversity:
Alex Comninos, Civil Society, Male, African Group
Deborah Brown, Civil Society, Female, Western European and Others Group
Jelena Jovanovic, Technical Community, Female, Eastern European Group
Vidushi Marda, Civil Society, Female, Asia-Pacific Group
Lorena Jaume-Palasi, Civil Society, Female, Western European and Others Group
Joy Liddicoat, Academia, Female, Western European and Others Group
Karen Reilly, Technical Community, Female, Western European and Others Group
Chinmayi Arun, Academia, Female, Asia-Pacific Group
Malavika Jayaram, Civil Society, Female, Asia-Pacific Group
Online Participation:
All breakaway groups, whether online or on-site, will discuss and produce an output for presentation. Online participation will take place in breakaway groups on the collaborative online notepad Etherpad, which allows participants to chat as well as to draft a document for presentation at the feedback sessions. Online participation will be advertised through Twitter and the remote participation platforms. Groups with on-site participants will also be encouraged to use Etherpad to develop and report on their discussions. An ideal online participation outcome would have on-site and online participants working on the same Etherpads, thus building bridges with remote participation.
Discussion Facilitation:
- The speakers will each briefly introduce their own concerns and interventions regarding AI.
- Groups will ask how algorithms affect their lives and identify problems that algorithms could cause. The groups will be broken up thematically.
- One person will report back from each group.
- The group discussions will feed into a final outcome document.
- For more information on the group discussions, see the agenda.
Onsite Moderator: [email protected]
Online Moderator: [email protected]
Rapporteur: Association for Progressive Communications
Agenda:
Part 1: Lightning talks - 25 minutes
- Each speaker gives a "lightning talk" of max 2 minutes on their specific area of intervention/expertise.
Part 2: Breakaway group discussion - 20 minutes
- Breakaway groups discuss different aspects of algorithmic transparency.
- The remote participants will organise an internet breakaway group.
- Someone from each group volunteers as rapporteur.
Part 3: Report back from breakaway group discussions - 10 minutes
- Rapporteurs report back and display their flip charts.
- Remote participants report back via the internet breakaway group.
- Some panelists take notes and document the discussion in order to create an outcome document for the event.
Part 4: Questions - 5-10 minutes
- Wrap up with questions and interventions from the audience and remote participants.
Session Time: Monday, 12 November 2018, 11:20 to 12:20
Room: Salle IX