The dual function of explanations: Why it is useful to compute explanations
Institution: 1. School of Law, University of Southampton, UK; 2. Department of Informatics, King's College London, UK
Abstract: Whilst the legal debate concerning automated decision-making has focused mainly on whether a ‘right to explanation’ exists in the GDPR, the emergence of ‘explainable Artificial Intelligence’ (XAI) has produced taxonomies for the explanation of Artificial Intelligence (AI) systems. However, various researchers have warned that transparency of algorithmic processes is not in itself enough. Better and easier tools are needed for the assessment and review of the socio-technical systems that incorporate automated decision-making. The PLEAD project suggests that, aside from fulfilling the obligations set forth by Article 22 of the GDPR, explanations can also support a holistic compliance strategy when used as detective controls. PLEAD aims to show that computable explanations can facilitate monitoring and auditing, and make compliance more systematic. Automated computable explanations can be key controls in fulfilling accountability and data-protection-by-design obligations, empowering both controllers and data subjects. This opinion piece presents the work undertaken by the PLEAD project towards facilitating the generation of computable explanations. PLEAD leverages provenance-based technology to compute explanations as external detective controls to the benefit of data subjects and as internal detective controls to the benefit of the data controller.
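To make the abstract's "dual function" concrete, the following is a minimal, self-contained sketch (in Python) of the underlying idea: record the provenance of an automated decision as a set of PROV-style relations (what was used, what was generated, which agent was responsible), then derive two explanations from the same record, one phrased for the data subject (external detective control) and one for the controller's audit trail (internal detective control). This is not PLEAD's implementation, which builds on the W3C PROV data model and dedicated provenance tooling; all identifiers here (ProvenanceRecord, credit_scoring_v2, and so on) are hypothetical.

from dataclasses import dataclass

@dataclass
class ProvenanceRecord:
    """Provenance of one automated decision: inputs used, activity
    that generated the outcome, and the responsible agent."""
    decision: str
    inputs: list[str]
    activity: str
    agent: str
    legal_basis: str

    def explain_for_data_subject(self) -> str:
        # External detective control: help the data subject understand
        # and, if needed, contest the decision (cf. GDPR Article 22).
        return (
            f"The outcome '{self.decision}' was produced by the automated "
            f"step '{self.activity}', operated by '{self.agent}', using "
            f"your data: {', '.join(self.inputs)}. "
            f"Legal basis: {self.legal_basis}. You may request human review."
        )

    def explain_for_controller(self) -> str:
        # Internal detective control: a compliance-oriented view of the
        # same provenance, suitable for systematic monitoring and audit.
        return (
            f"AUDIT decision={self.decision} activity={self.activity} "
            f"agent={self.agent} inputs={self.inputs} "
            f"basis={self.legal_basis!r}"
        )

record = ProvenanceRecord(
    decision="loan_application_refused",
    inputs=["income_statement", "credit_history"],
    activity="credit_scoring_v2",
    agent="scoring_service",
    legal_basis="GDPR Art. 6(1)(b) (contract)",
)
print(record.explain_for_data_subject())
print(record.explain_for_controller())

The design point the sketch illustrates is that both explanations are computed from a single provenance record captured at decision time, so the data-subject-facing text and the controller's audit entry cannot drift apart; this is the sense in which computable explanations can serve as detective controls for both parties.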
This article is indexed in ScienceDirect and other databases.