TRAINING EMOTION DETECTION FOR AUTISTIC PERSONS WITH AI AND AR
1 University of Alicante, University Institute for Computer Research (SPAIN)
2 University of Alicante, Department of Developmental Psychology and Teaching (SPAIN)
About this paper:
Appears in: INTED2022 Proceedings
Publication year: 2022
Pages: 953-958
ISBN: 978-84-09-37758-9
ISSN: 2340-1079
doi: 10.21125/inted.2022.0307
Conference name: 16th International Technology, Education and Development Conference
Dates: 7-8 March, 2022
Location: Online Conference
Abstract:
The ability to detect and recognize the emotions of one's interlocutors is essential to the communication process. The mood and emotion felt by the other person influence the outcome of a conversation: if a person is feeling happy, a joke would not be out of place; on the other hand, if the person is feeling sad, one should avoid upsetting them. Most people perform this task intuitively and seamlessly, but some cannot. For example, a common trait of autistic people is difficulty detecting and recognizing other people's emotions. As a result, they are likely to react unexpectedly during the course of a conversation.

Given the importance of recognizing the emotions felt by other people in order to communicate effectively and efficiently, we propose a system to train this skill in autistic people. The system, deployed on an augmented reality (AR) headset, automatically detects the emotion of the people with whom the user is interacting and displays this information on the AR headset so that the user knows how to engage appropriately. The detection is performed by an artificial intelligence (AI) algorithm that consumes the images provided by the AR headset camera and predicts the emotion corresponding to each face in the scene, displaying it visually using AR techniques. Through this feedback, the user trains in the emotion recognition task until they can finally recognize emotions on their own.

In summary, the proposed system is deployed on a Microsoft HoloLens 2, a powerful AR headset capable of displaying virtual objects within the real world. The device also features a range of sensors, including a high-resolution colour camera, from which we continuously retrieve images. Each image is forwarded to DLib's face detector (http://dlib.net/), an artificial intelligence-based algorithm that returns the location of each face in the input image. Each detected face is then fed to a deep learning-based architecture that predicts the emotion. This architecture was trained from scratch on several datasets in order to provide accuracy and generalization capabilities. Finally, the predicted emotion is shown as an aura in the AR world, so the user can easily identify the emotion felt by their interlocutor.
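The per-frame flow described above (camera frame, face detection, per-face emotion classification, one AR "aura" per face) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function and type names are hypothetical, and the detector and classifier are injected as callables so the flow can be shown without the HoloLens 2 or a trained model. In the actual system, the detector would wrap DLib's `get_frontal_face_detector()` and the classifier would be the CNN trained from scratch.

```python
# Illustrative sketch of the paper's pipeline; all names are hypothetical.
from dataclasses import dataclass
from typing import Callable, List, Tuple

# Face bounding box in image pixels: (left, top, right, bottom).
Box = Tuple[int, int, int, int]

@dataclass
class Aura:
    """One AR overlay: where to draw it and which emotion it conveys."""
    box: Box
    emotion: str  # e.g. "happy", "sad", "neutral"

def process_frame(frame,
                  detect_faces: Callable[[object], List[Box]],
                  classify_emotion: Callable[[object, Box], str]) -> List[Aura]:
    """Run one pipeline iteration on a single camera frame.

    Detect every face in the frame, classify the emotion of each face
    crop, and return one Aura per face for the AR renderer to display.
    """
    auras = []
    for box in detect_faces(frame):
        emotion = classify_emotion(frame, box)
        auras.append(Aura(box=box, emotion=emotion))
    return auras

if __name__ == "__main__":
    # Stubbed components stand in for DLib and the emotion CNN.
    fake_frame = object()
    detector = lambda f: [(10, 10, 90, 110)]   # pretend one face was found
    classifier = lambda f, b: "happy"          # constant prediction
    print(process_frame(fake_frame, detector, classifier))
```

Injecting the detector and classifier keeps the frame loop independent of any specific model, which also mirrors how the paper separates face localization (DLib) from emotion prediction (the custom deep network).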
Keywords:
Autistic persons, emotion, artificial intelligence, augmented reality.