AN EXPERIENCE WITH AN ONLINE ASSESSMENT SYSTEM WITH PERSONALIZED EXAMS FOR EACH STUDENT
Universidad Nacional Autónoma de México (MEXICO)
About this paper:
Appears in: ICERI2021 Proceedings
Publication year: 2021
Pages: 4413-4418
ISBN: 978-84-09-34549-6
ISSN: 2340-1095
doi: 10.21125/iceri.2021.1017
Conference name: 14th annual International Conference of Education, Research and Innovation
Dates: 8-9 November, 2021
Location: Online Conference
Abstract:
The migration to fully online education brought about by the COVID-19 pandemic requires the development of assessment methodologies that capture student performance as accurately as possible.

In Mexico, social distancing measures forced the closure of educational facilities throughout the country, so instruction became fully online, with classes delivered by videoconference (Zoom) and exams taken at home through digital platforms (Moodle). This situation began on April 21, 2020, when the National Healthy Distance period (Jornada Nacional de Sana Distancia) was declared, and it has continued to date. As a consequence, exams are applied without the teacher's face-to-face supervision that would deter academic dishonesty, and students now have many communication tools (WhatsApp, Facebook, etc.) with which to help one another, so continuous monitoring of students' academic performance and effort can be compromised. In these circumstances, and in order to assess and encourage student effort as reliably as possible, e-learning tools have been adopted as part of a set of strategies and practical measures aimed at evaluating students during the pandemic and promoting lifelong online learning.

This work summarizes the experience with a system for the automatic generation and evaluation of personalized exams, based on exercises written in LaTeX (a markup language for typesetting documents) that embed R code for the dynamic generation of exercises. The system presents each student with a fully customized exam, so students solve entirely different exams. Each exercise is a single file in Sweave format (.Rnw) that interleaves R code, which generates the data, with LaTeX code, which describes the question and its solution (Zeileis 2014). From this one file, N unique exercises are produced, each with its own particular data. In this way, the opportunity for dishonesty when students take exams at home is controlled.
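The abstract does not reproduce an exercise file, but a minimal sketch of such a Sweave (.Rnw) exercise, following the question/solution format of the R exams package (Zeileis 2014), might look as follows; the variable names, numeric ranges, and topic are illustrative, not taken from the actual exam bank:

    <<echo=FALSE, results=hide>>=
    ## Data generation: each compilation draws new values,
    ## so every student receives a different version.
    n    <- sample(30:60, 1)
    xbar <- round(runif(1, 48, 52), 2)
    s    <- round(runif(1, 1.5, 3.5), 2)
    sol  <- round(xbar + qt(0.975, n - 1) * s / sqrt(n), 2)
    @

    \begin{question}
    A sample of $\Sexpr{n}$ parts has mean $\bar{x} = \Sexpr{xbar}$ and
    standard deviation $s = \Sexpr{s}$. Compute the upper limit of the
    95\% confidence interval for the mean.
    \end{question}

    \begin{solution}
    The upper limit is
    $\bar{x} + t_{0.975,\,n-1}\, s/\sqrt{n} = \Sexpr{sol}$.
    \end{solution}

    %% META-INFORMATION
    %% \extype{num}
    %% \exsolution{\Sexpr{sol}}
    %% \extol{0.05}

Each compilation re-evaluates the R chunk, so the same exercise template yields N distinct numeric versions together with their solutions.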

The sample consists of students of the Bachelor of Industrial Engineering and the Bachelor of Mechanical Electrical Engineering, both taking the Probability and Statistics course at the Cuautitlán Faculty of Higher Studies. A first exam was administered in which the questions were not personalized and were presented to the students in random order. In a second step, the same topics were evaluated, but this time each student received personalized, distinct questions in random order. For this group, the students' grades show a clear difference in the levels achieved before and after introducing the automated evaluation system. One advantage of keeping each exercise in its own file is that maintenance teams can work on the exercise pool in a multi-platform, multi-author environment, since each team member can independently develop and edit a single exercise. The same system has since been applied to evaluate more than 10 groups of approximately 40 students each, and it has been observed that, in addition to social factors (COVID-19), technical factors such as system quality and Internet quality have a significant effect on the effectiveness of online learning systems (Selim 2007).
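The abstract does not list the exact commands used, but with the R exams package the personalized versions for a group of roughly 40 students could be produced and delivered through Moodle with a call along these lines (the exercise file names and quiz name are hypothetical):

    library("exams")
    ## Build one Moodle XML file with 40 random replications of each
    ## exercise; the Moodle quiz then serves a different version to
    ## each student in the group.
    exams2moodle(c("confint.Rnw", "hypothesis.Rnw", "regression.Rnw"),
                 n = 40, name = "probstat_exam")

The resulting XML file is imported into the Moodle question bank, from which the quiz draws a personalized exam for every student.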
Keywords:
Automated Evaluation, Moodle, LaTeX, R Code.