UNPACKING THE INFLUENCE OF COMPUTER-BASED TESTING MODALITIES ON STUDENT STUDY BEHAVIOUR AND PERFORMANCE
University of Illinois, Urbana-Champaign (UNITED STATES)
About this paper:
Appears in: EDULEARN24 Proceedings
Publication year: 2024
Pages: 9900-9908
ISBN: 978-84-09-62938-1
ISSN: 2340-1117
doi: 10.21125/edulearn.2024.2379
Conference name: 16th International Conference on Education and New Learning Technologies
Dates: 1-3 July, 2024
Location: Palma, Spain
Abstract:
In this paper, we explore the relationship between computer-based exam modalities, study behaviors, and overall exam performance. Our research focuses on four distinct testing approaches:
(1) asynchronous, proctored exams in computer labs with institution-provided, locked-down computers;
(2) synchronous, in-class, proctored exams where students bring their own devices (BYOD);
(3) synchronous, BYOD exams taken at home with Zoom proctoring; and
(4) synchronous, unproctored, BYOD exams at home. Conducted over four semesters, from Fall 2021 to Fall 2023, the study involved a sophomore/junior-level numerical methods course with high enrollment from computer science, math, and engineering majors at a large R1 university in the United States.

The course's summative assessments comprised six 50-minute exams, incorporating a variety of question types such as numeric entry, matrix input, multiple choice, and coding. The exams were administered through an open-source assessment platform that generates a unique variant of each question for each student using parameterizable item generators, enabling these generators to be reused across semesters. All exams were auto-graded, providing immediate feedback to students.
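The idea of a parameterizable item generator can be sketched as follows. This is an illustrative assumption, not the platform's actual API: the function name `generate_variant` and the seeding scheme are hypothetical, but they show how seeding a random number generator with student and exam identifiers yields a variant that is unique per student yet reproducible for regrading.

```python
import random

def generate_variant(student_id: str, exam_id: str) -> dict:
    """Hypothetical item generator: the (exam_id, student_id) seed makes
    each student's variant unique but deterministic, so the same variant
    can be regenerated for grading or review."""
    rng = random.Random(f"{exam_id}:{student_id}")
    a = rng.randint(2, 9)
    b = rng.randint(2, 9)
    question = f"What is {a} * {b} + {b}?"
    answer = a * b + b  # auto-grading compares against this key
    return {"question": question, "answer": answer}
```

Because the seed is derived from stable identifiers, regenerating a student's variant in a later semester (or during a regrade) reproduces the same parameters, which is what allows generators to be reused across offerings.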

To prepare for these exams, students were granted access to practice exams one week prior to each actual exam, available until the exam date. These practice exams were hosted on the same assessment platform and used the same question generators as the actual exams. The platform's log data, which captures the sequence and duration of the questions each student attempted, revealed variations in study behavior across exam modalities. Specifically, study time increased with the level of exam security: proctored exams on locked-down institutional computers elicited the most study effort, and unproctored exams the least. Students also tended to begin studying later for some modalities, notably unproctored exams.

Our findings delineate two distinct groups of students: those whose study habits change significantly with exam modality, and those whose habits remain unchanged. A notable observation is that students who attain high accuracy in practice sessions generally outperform their peers on exams. Success is attributable not merely to attempting a broad spectrum of questions, but to a deep understanding of concepts that leads to solving problems correctly. Additionally, while total time spent on practice exercises correlates positively with exam scores, the strategic use of study time proves more influential: students who focus on understanding and addressing their mistakes, rather than simply increasing study volume, tend to achieve higher scores.

The timing of practice also emerges as a critical factor. Our results indicate that early and distributed practice sessions, completed more than 48 hours before the exam, correlate with superior exam performance, highlighting the benefits of spaced learning over cramming. Conversely, a negative correlation between exam performance and practice within 24 hours of the exam suggests that the cognitive load from cramming may impede the deep learning required for success in complex subjects.
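The timing analysis described above can be sketched as a small log-processing step. This is a minimal illustration under assumed conventions: the session data layout (`(start_time, minutes)` pairs) and function names are hypothetical, while the 48-hour and 24-hour cutoffs follow the buckets used in the findings. A plain Pearson correlation then relates the bucketed practice time to exam scores.

```python
from datetime import datetime, timedelta

def split_practice_minutes(sessions, exam_time):
    """Bucket practice minutes by lead time before the exam.
    `sessions` is a list of (start_datetime, minutes) pairs.
    Returns (early, late): minutes logged more than 48 h before
    the exam and minutes logged within 24 h of it."""
    early = late = 0
    for start, minutes in sessions:
        lead = exam_time - start
        if lead > timedelta(hours=48):
            early += minutes
        elif lead < timedelta(hours=24):
            late += minutes
    return early, late

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

With per-student `(early, late)` totals in hand, correlating each column against exam scores would reproduce the pattern reported here: a positive coefficient for early practice and a negative one for last-day cramming.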
Keywords:
Computer-based testing, auto-graders, mastery learning, study behaviors.