About this paper

Appears in: INTED2019 Proceedings
Pages: 5448-5452
Publication year: 2019
ISBN: 978-84-09-08619-1
ISSN: 2340-1079
doi: 10.21125/inted.2019.1345

Conference name: 13th International Technology, Education and Development Conference
Dates: 11-13 March, 2019
Location: Valencia, Spain

ANALYZING STUDENTS AND TEACHERS’ EMOTIONS DURING CLASS FOR IMPROVED LEARNING

It is well known that emotions play an important role in speeches, presentations and lectures. For instance, when a lecture is exciting and full of practical approaches, the audience tends to remember more of that presentation. On the contrary, if the lecture is boring and unappealing, the audience inevitably disengages and quickly forgets its content.

Despite the existence of techniques for creating expectation and generating positive emotions during a speech or a lecture, it is very difficult to measure whether those techniques triggered the intended emotion at the intended time. The problem is further complicated by external factors such as the prior state of the audience and the teacher, or the content of the lecture itself.

In this work, we propose software capable of measuring the progression of the emotions of teachers and students. The application we developed takes the feed of several cameras and computes the average emotion of the students and the teacher in real time, leveraging deep learning. Our approach will help teachers tune their lectures, presentations and speeches to be more appealing and engaging.

Setup and approach:
To enable the emotion analysis described above, several cameras must be placed in the classroom: one pointing at the teacher and at least one pointing at the students. The camera streams are sent to our software. First, a YOLOv3 [1] network extracts the area of interest of each face in the images. The faces are then cropped from the source images and forwarded to EMONet, which is in charge of estimating the amount of each emotion the person is showing. So far, EMONet can provide a score for the following emotions: neutral, surprised, sad, happy, fearful, disgusted and angry. Finally, the scores for each emotion are averaged over all detected people. This pipeline is executed for every frame of the video, and the scores are plotted live and displayed to the teacher, who can then adjust their approach to the class in order to be more engaging. The software is also able to save the gathered data for further offline analysis.
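The per-frame pipeline above (detect faces, score each face, average the scores) can be sketched as follows. This is a minimal illustration, not the authors' implementation: `detect_faces` and `score_emotions` are hypothetical stand-ins for the YOLOv3 and EMONet stages, assumed to return cropped face images and a per-emotion score dictionary respectively.

```python
# Sketch of one step of the described pipeline. The two callables passed
# in stand for the deep-learning stages: detect_faces (YOLOv3 stand-in)
# returns cropped face images, score_emotions (EMONet stand-in) returns
# a dict mapping each emotion name to a score. Both names are
# hypothetical, not the paper's actual API.

EMOTIONS = ("neutral", "surprised", "sad", "happy",
            "fearful", "disgusted", "angry")

def average_emotions(per_face_scores):
    """Average each emotion score over all faces detected in one frame."""
    if not per_face_scores:
        return {e: 0.0 for e in EMOTIONS}
    n = len(per_face_scores)
    return {e: sum(s[e] for s in per_face_scores) / n for e in EMOTIONS}

def process_frame(frame, detect_faces, score_emotions):
    """One pipeline step: detect faces, score each face, average scores."""
    faces = detect_faces(frame)                        # YOLOv3 stage
    scores = [score_emotions(face) for face in faces]  # EMONet stage
    return average_emotions(scores)
```

In a live setting this function would be called once per captured frame, and the returned averages appended to the plot shown to the teacher.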

This application could help teachers measure the students' reaction to a given lecture, then analyze and adjust the approach of that lecture for the following classes, with the aim of provoking more positive emotions in the students so that they feel more engaged.
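The saved per-frame scores mentioned earlier could then be examined offline. As a brief sketch (the on-disk format is not specified in the paper, so a simple list of per-frame score dictionaries is assumed), one basic analysis is finding which emotion dominated a stretch of the lecture:

```python
# Offline-analysis sketch. The input format is an assumption: a list of
# per-frame dicts mapping emotion names to averaged scores, as produced
# during the live session.

def dominant_emotion(frames):
    """Return the emotion with the highest total score over the frames."""
    totals = {}
    for scores in frames:
        for emotion, value in scores.items():
            totals[emotion] = totals.get(emotion, 0.0) + value
    return max(totals, key=totals.get)
```

Applied to the frames covering, say, a practical demonstration versus a theory segment, this would indicate which part of the lecture elicited the more positive response.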
@InProceedings{MARTINEZMARTIN2019ANA,
author = {Martinez-Martin, E. and Escalona, F. and Gomez-Donoso, F. and Orts-Escolano, S. and Cazorla, M.},
title = {ANALYZING STUDENTS AND TEACHERS’ EMOTIONS DURING CLASS FOR IMPROVED LEARNING},
series = {13th International Technology, Education and Development Conference},
booktitle = {INTED2019 Proceedings},
isbn = {978-84-09-08619-1},
issn = {2340-1079},
doi = {10.21125/inted.2019.1345},
url = {http://dx.doi.org/10.21125/inted.2019.1345},
publisher = {IATED},
location = {Valencia, Spain},
month = {11-13 March, 2019},
year = {2019},
pages = {5448-5452}}