Explainable Artificial Intelligence for Kids
- DOI
- 10.2991/eusflat-19.2019.21
- Keywords
- Explainable AI; Natural Language Generation; Human-Computer Interaction; Decision Trees; Fuzzy Rule-based Classifiers
- Abstract
Artificial Intelligence (AI) is part of our everyday life and has become one of the most outstanding and strategic technologies of the 21st century. Explainable AI (XAI for short) is expected to endow AI systems with the ability to explain themselves when interacting with humans. This paper describes how to provide kids with natural explanations, i.e., explanations verbalized in Natural Language, in the context of recognizing the roles of basketball players. Semantic grounding is achieved through fuzzy concepts such as tall or short. Selected players are automatically classified by an ensemble of three different decision trees and one fuzzy rule-based classifier. All the single classifiers were first trained with the open-source Weka software, and natural explanations were then generated by the open-source web service ExpliClas. The Human-Computer Interaction interface is implemented in Scratch, a visual programming language adapted to kids. The resulting Scratch program is used for dissemination purposes when high-school teenagers visit the Research Center in Intelligent Technologies of the University of Santiago de Compostela.
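To illustrate the kind of semantic grounding mentioned in the abstract, the sketch below defines trapezoidal fuzzy sets for player height. The actual linguistic terms and breakpoints used in the paper are not stated in this abstract, so the names and values (in cm) are illustrative assumptions only.

```python
# Minimal sketch of fuzzy semantic grounding for player height.
# The breakpoints below are illustrative assumptions, not the paper's values.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function with support [a, d] and core [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def height_memberships(height_cm):
    """Degrees to which a player's height is 'short', 'medium', or 'tall'."""
    return {
        "short": trapezoid(height_cm, 150, 150, 180, 190),
        "medium": trapezoid(height_cm, 180, 190, 195, 205),
        "tall": trapezoid(height_cm, 195, 205, 230, 230),
    }

if __name__ == "__main__":
    # A 200 cm player is partly 'medium' and partly 'tall', so a generated
    # explanation can use words such as "quite tall" instead of raw numbers.
    print(height_memberships(200))  # {'short': 0.0, 'medium': 0.5, 'tall': 0.5}
```

Membership degrees of this kind are what allow the generated explanations to talk about players in everyday words rather than numeric thresholds.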
- Copyright
- © 2019, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
- Cite this article
Jose M. Alonso, "Explainable Artificial Intelligence for Kids," in Proceedings of the 11th Conference of the European Society for Fuzzy Logic and Technology (EUSFLAT 2019), Atlantis Press, August 2019, pp. 134-141, ISSN 2589-6644, https://doi.org/10.2991/eusflat-19.2019.21.