ADDRESSING TRUST CONCERNS IN EDUCATIONAL ENVIRONMENTS: DEVELOPING AN EXPLAINABLE EMBODIED CONVERSATIONAL AGENT
Hochschule Karlsruhe (GERMANY)
About this paper:
Appears in: EDULEARN24 Proceedings
Publication year: 2024
Pages: 3033-3039
ISBN: 978-84-09-62938-1
ISSN: 2340-1117
doi: 10.21125/edulearn.2024.0805
Conference name: 16th International Conference on Education and New Learning Technologies
Dates: 1-3 July, 2024
Location: Palma, Spain
Abstract:
Technological advancements have significantly shaped educational technologies, offering new avenues for engaging learners effectively. Virtual Reality (VR) holds promise in this regard, providing personal and social affordances crucial for immersive educational experiences (Grivokostopoulou et al., 2020).

Interfaces in educational technologies play a pivotal role in enhancing engagement and comprehension. VR interfaces increasingly incorporate avatars representing users and Embodied Conversational Agents (ECAs) simulating human-like interactions through natural language processing and visually appealing representations (Aljaroodi et al., 2019). For instance, in distance education, the lack of personal and social presence is a noted concern. ECAs can address this by enhancing social interaction and motivation (Fitton et al., 2020).

While ECAs offer educational potential, trust issues arise. Chiou et al. (2020) present a study suggesting that trust influences how well learners grasp and retain concepts; ensuring trust is therefore an important factor in the educational process. Explainability methods can mitigate trust concerns by having the agent explain its reasoning (justification), its view of the environment (internal state), or its plans (intent) (Wallkötter et al., 2021). Explainable ECAs fall under eXplainable Artificial Intelligence (XAI) and use similar techniques, but differ in that they aim for autonomous interaction with humans and the environment. Verbal and non-verbal modes of explainability in ECAs offer users diverse channels for comprehension, enhancing trust and enriching user experiences. In particular, verbal explainability methods, frequently built on text-to-speech technologies, provide spoken interpretations of complex information within ECAs.
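For illustration, the sketch below shows one way an ECA could route the three explanation types (justification, internal state, intent) into spoken-language templates suitable for a text-to-speech channel. All class, function, and template names here are hypothetical assumptions for illustration only and do not come from the cited works.

    from enum import Enum

    class ExplanationType(Enum):
        JUSTIFICATION = "justification"    # why the agent acted as it did
        INTERNAL_STATE = "internal_state"  # how the agent perceives the environment
        INTENT = "intent"                  # what the agent plans to do next

    def compose_explanation(kind: ExplanationType, content: str) -> str:
        """Wrap raw explanation content in a spoken-language template."""
        templates = {
            ExplanationType.JUSTIFICATION: "I chose this because {c}.",
            ExplanationType.INTERNAL_STATE: "Right now I can see that {c}.",
            ExplanationType.INTENT: "Next, I am planning to {c}.",
        }
        return templates[kind].format(c=content)

    # Example: a verbal (text-to-speech ready) justification
    print(compose_explanation(ExplanationType.JUSTIFICATION,
                              "you answered the previous question incorrectly"))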

This paper will explore the efficacy of explainability methods in ECAs within educational environments, assessing their impact on trust, user motivation, and engagement. We propose a customizable ECA for educational platforms that offers varied personalities and tones of voice, ranging from enthusiastic to apathetic, during its explanations. Users can tailor the ECA's emotional tone and voice characteristics, including pitch, tempo, gender, and frequency, ensuring a responsive interaction experience aligned with instructional preferences and emotional context.
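As a rough sketch of the proposed customization, the user-adjustable voice and tone parameters mentioned above could be captured in a configuration object along the following lines. The field names, presets, and value ranges are illustrative assumptions, not the paper's actual implementation.

    from dataclasses import dataclass

    @dataclass
    class VoiceConfig:
        """Illustrative container for user-adjustable ECA voice settings."""
        personality: str = "enthusiastic"  # e.g. "enthusiastic" through "apathetic"
        gender: str = "female"             # voice gender preset
        pitch: float = 1.0                 # relative pitch multiplier
        tempo: float = 1.0                 # speaking-rate multiplier
        frequency_hz: float = 180.0        # assumed base fundamental frequency

    # A learner preferring a calmer, slower explanation might choose:
    calm_voice = VoiceConfig(personality="calm", pitch=0.9, tempo=0.8, frequency_hz=150.0)
    print(calm_voice)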

References:
[1] H. M. Aljaroodi, M. T. P. Adam, R. Chiong, and T. Teubner, ‘Avatars and Embodied Agents in Experimental Information Systems Research: A Systematic Review and Conceptual Framework’, Australasian Journal of Information Systems, vol. 23, Oct. 2019, doi: 10.3127/ajis.v23i0.1841.
[2] E. K. Chiou, N. L. Schroeder, and S. D. Craig, ‘How we trust, perceive, and learn from virtual humans: The influence of voice quality’, Computers & Education, vol. 146, Mar. 2020, doi: 10.1016/j.compedu.2019.103756.
[3] I. S. Fitton, D. J. Finnegan, and M. J. Proulx, ‘Immersive virtual environments and embodied agents for e-learning applications’, PeerJ Comput. Sci., vol. 6, Nov. 2020, doi: 10.7717/peerj-cs.315.
[4] F. Grivokostopoulou, K. Kovas, and I. Perikos, ‘The Effectiveness of Embodied Pedagogical Agents and Their Impact on Students Learning in Virtual Worlds’, Applied Sciences, vol. 10, no. 5, Jan. 2020, doi: 10.3390/app10051739.
[5] S. Wallkötter, S. Tulli, G. Castellano, A. Paiva, and M. Chetouani, ‘Explainable Embodied Agents Through Social Cues: A Review’, J. Hum.-Robot Interact., vol. 10, no. 3, Jul. 2021, doi: 10.1145/3457188.
Keywords:
Embodied Conversational Agent, Explainability, Trust, Artificial Intelligence.