CHALLENGE-BASED LEARNING IN EXPLAINABLE ARTIFICIAL INTELLIGENCE EDUCATION
Universidad Politécnica de Madrid (SPAIN)
About this paper:
Conference name: 13th International Conference on Education and New Learning Technologies
Dates: 5-6 July, 2021
Location: Online Conference
Abstract: The rise of Data Science (DS) as a tool to extract insights from large amounts of data has disrupted business, health care, politics, education, and research, among other fields. The data scientist profile is in high demand in all those areas, and even domain experts are encouraged to develop these kinds of skills. As a result, the number of DS Master's programs and online courses has soared in the past decade. DS is an interdisciplinary field that overlaps with Machine Learning (ML) and Deep Learning (DL), both of which fall within Artificial Intelligence (AI). However, there is growing concern about the inability of many successful AI models to produce reasonable explanations for their decisions. Explainable Artificial Intelligence (XAI) aims to bridge this gap by leveraging AI models whose decisions are easily understandable by human beings.
In this scenario, it is paramount that DS students develop the critical thinking skills needed to endow AI models with an explainability dimension. This is particularly the case when dealing with critical problems, such as microRNA recognition for cervical cancer detection. This paper contributes by:
(1) using and adapting the Challenge-Based Learning (CBL) framework for the XAI topic;
(2) designing specific XAI challenges that have been explored by students of a Master's in Computational Biology; and,
(3) proposing tools that support the learning of XAI and which combine several AI fields, such as ML, expert systems, and Case-Based Reasoning (CBR).
Evaluation results indicate that decision rules are the models most trusted for critical problems, followed by decision trees and CBR.
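The trust placed in decision rules stems from their transparency: each prediction carries its own human-readable justification. A minimal Python sketch of this idea follows; the feature names, thresholds, and labels are purely hypothetical illustrations and are not taken from the paper or its challenges.

```python
# Illustrative sketch: a tiny rule-based classifier in which every
# decision is paired with a human-readable explanation -- the property
# that makes decision rules attractive for critical problems such as
# microRNA recognition.

def classify(sample, rules, default=("unknown", "no rule matched")):
    """Return (label, explanation) for the first rule that fires."""
    for condition, label, explanation in rules:
        if condition(sample):
            return label, explanation
    return default

# Hypothetical rules over made-up sequence features (illustration only).
rules = [
    (lambda s: s["expression"] > 0.8 and s["conservation"] > 0.9,
     "candidate-miRNA",
     "high expression (>0.8) AND high conservation (>0.9)"),
    (lambda s: s["expression"] < 0.2,
     "not-miRNA",
     "low expression (<0.2)"),
]

label, why = classify({"expression": 0.9, "conservation": 0.95}, rules)
print(f"{label}: because {why}")
```

Unlike an opaque model, the rule list can be inspected, audited, and discussed with domain experts, which is why transparency of this kind tends to earn user trust in high-stakes settings.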
Keywords: Challenge-Based Learning, Data Science, Explainable Artificial Intelligence.