DEVELOPMENT OF A COMPUTER-BASED ASSESSMENT TOOL FOR UNDERGRADUATE PROGRAMMING COURSES
School of Chemical Engineering, National Technical University of Athens (GREECE)
About this paper:
Appears in: ICERI2009 Proceedings
Publication year: 2009
Pages: 6990-6997
ISBN: 978-84-613-2953-3
ISSN: 2340-1095
Conference name: 2nd International Conference of Education, Research and Innovation
Dates: 16-18 November, 2009
Location: Madrid, Spain
Abstract:
With the rapid advent of new technologies, computer-based assessment (CBA) is considered a fast and accurate tool for assessing students’ learning and has several advantages over paper-and-pencil tests (PPT). In this study, the score equivalence of CBA and PPT is examined in an undergraduate course on introductory computer programming at the School of Chemical Engineering of the National Technical University of Athens. In particular, we analyse the scores of 211 students who participated in an exam consisting of multiple-choice questions, completing and correcting code in Matlab and Fortran, and writing code in Fortran.

The following questions are addressed:
i. Are the overall test scores different in the CBA and PPT?
ii. Are the test scores for each type of exam different in the CBA and PPT?
iii. Is there any discrepancy among low-, medium- and high-scoring students across the various exam types in CBA and PPT?
iv. How are the test scores correlated with time on task and the number of answer changes?
v. Is CBA or PPT more suitable for testing students’ performance?

In order to test the efficiency of CBA, a practical examination was designed and administered in a computer laboratory. The exam system is designed to be as flexible and secure as possible, while retaining the handling advantages of a paper test. Students can review and change their answers, skip questions and complete them later, see the remaining time, and temporarily pause the test if necessary. They also have the advantages of an open-book test, since they can access the help files of the programming environments and use any supporting document stored in their private folder on the laboratory server. Multiple-choice questions are evaluated automatically by the application. The applied exercises are evaluated semi-automatically: the application points out the mistakes, but the marking of errors is done by the grader. This approach has tangible benefits for both students and tutors, providing easier, quicker and more objective assessment.
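
The abstract does not detail the marking logic, but the automatic evaluation of the multiple-choice part could, for illustration, work along the lines of the short Python sketch below. The function name, answer-key format and one-point-per-item scoring rule are assumptions made for this example, not the authors’ implementation.

    # Hypothetical sketch of automatic multiple-choice scoring; not the
    # authors' actual application. Question IDs and scoring are illustrative.
    def score_multiple_choice(answer_key, responses, points_per_item=1.0):
        """Return the total score for the automatically marked part."""
        score = 0.0
        for question_id, correct_option in answer_key.items():
            # Skipped or unanswered questions simply earn no points.
            if responses.get(question_id) == correct_option:
                score += points_per_item
        return score

    # Example with made-up data: Q3 was skipped by the student.
    key = {"Q1": "b", "Q2": "d", "Q3": "a"}
    student = {"Q1": "b", "Q2": "c"}
    print(score_multiple_choice(key, student))  # prints 1.0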

In general, our results showed that students performed better and achieved higher scores under CBA. However, the results differed across the parts of the test. For the first part, the multiple-choice questions, students obtained comparable scores under CBA and PPT. For the second and third parts, i.e. the exercises in Fortran and Matlab, a significant difference in students’ performance between the two test versions was revealed. Students examined in the actual programming environment were more effective and received higher scores than those taking the paper examination. The facilities offered by CBA, which resemble the real programming workflow, help students compile their code and correct any syntax errors flagged by the compiler. The superior performance of students under CBA in the second and third parts of the test explains why CBA outperforms PPT overall. Further analysis of different groups of students showed that “good” students obtained significantly higher scores in all parts of the test than low-scoring students, in both test versions. In addition, changing answers and devoting more time to the task were found to have a beneficial effect on students’ performance. Thus, CBA can help tutors detect the difficulties faced by students and make the necessary modifications.
Keywords:
computer-based assessment, computers in higher education, paper-and-pencil test.