DEVELOPMENT OF A SOFTWARE SYSTEM FOR AUTOMATED TEST ASSEMBLY AND SCORING
University of Belgrade, School of Electrical Engineering (SERBIA)
About this paper:
Appears in: ICERI2010 Proceedings
Publication year: 2010
Pages: 6012-6016
ISBN: 978-84-614-2439-9
ISSN: 2340-1095
Conference name: 3rd International Conference of Education, Research and Innovation
Dates: 15-17 November, 2010
Location: Madrid, Spain
Abstract:
Composing student assessments and tests is demanding and time-consuming, especially in technical disciplines, where tests tend to replace problem-oriented tasks. Many commercial and proprietary toolkits have been developed to help teachers with these activities, ranging from optical-recognition scoring systems for standardized test forms to specialized software tools for test assembly based on a random choice of questions from a database. In this paper we present the development of the software system "test", designed to support frequent written-test assessment of knowledge in large groups of hundreds of students. The system has been designed and used for more than a decade, in various implementations and technologies, for composing and scoring tests in the procedural programming courses taught by the authors. Through its development, a generalized conceptual model of a test with heterogeneous elements was designed and supported by the implementation of the class library testCore, which forms the programming core of the "test" system. The functionality of the system includes the definition of multiple interdependent criteria for intelligent test composition, performed by the testGen component, as well as the search and maintenance of tests and problems, and statistical analysis of students' results. The testScore component provides scoring methods on an individual and group basis. The previously developed components testArs, for optical recognition of scanned test forms, and testMix, for generating multiple versions of the same test by changing the order of questions and/or answers, were integrated into the system. Tests can be exported to the SCORM format or to Moodle in order to provide students with an interactive environment for knowledge testing. Possible improvements of the system include the use of genetic algorithms for automated test assembly, as well as the introduction of parametrized problems, which can be especially convenient in the area of programming.
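
The paper does not publish source code; purely as an illustration of the two ideas named in the abstract (criteria-driven question selection, as performed by testGen, and generation of shuffled test versions, as performed by testMix), the following is a minimal sketch. All class, field, and method names are hypothetical assumptions, and the question model is deliberately simplified.

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Hypothetical question model; the paper's testCore model is not published
// here, so the fields below are illustrative only.
class Question {
    final String text;
    final List<String> answers;   // first answer assumed correct
    final String topic;
    final int difficulty;         // e.g. 1 (easy) .. 5 (hard)

    Question(String text, List<String> answers, String topic, int difficulty) {
        this.text = text;
        this.answers = answers;
        this.topic = topic;
        this.difficulty = difficulty;
    }
}

public class TestAssemblySketch {

    // Select up to n questions on a given topic at or below a maximum
    // difficulty: a toy stand-in for the criteria-driven assembly of testGen.
    static List<Question> selectByCriteria(List<Question> pool, String topic,
                                           int maxDifficulty, int n, long seed) {
        List<Question> matching = new ArrayList<>();
        for (Question q : pool) {
            if (q.topic.equals(topic) && q.difficulty <= maxDifficulty) {
                matching.add(q);
            }
        }
        Collections.shuffle(matching, new Random(seed));
        return matching.subList(0, Math.min(n, matching.size()));
    }

    // Produce one variant of a test by permuting question order and answer
    // order, which is the idea behind the testMix component.
    static List<Question> shuffledVariant(List<Question> base, long seed) {
        Random rnd = new Random(seed);
        List<Question> variant = new ArrayList<>();
        for (Question q : base) {
            List<String> answers = new ArrayList<>(q.answers);
            Collections.shuffle(answers, rnd);
            variant.add(new Question(q.text, answers, q.topic, q.difficulty));
        }
        Collections.shuffle(variant, rnd);
        return variant;
    }

    public static void main(String[] args) {
        List<Question> pool = List.of(
            new Question("Which keyword declares a constant in C?",
                         List.of("const", "static", "void"), "basics", 1),
            new Question("What does malloc return on failure?",
                         List.of("NULL", "0xFF", "-1"), "memory", 2),
            new Question("Which header declares printf?",
                         List.of("stdio.h", "stdlib.h", "string.h"), "basics", 1));

        List<Question> selected = selectByCriteria(pool, "basics", 3, 2, 7L);
        for (Question q : shuffledVariant(selected, 42L)) {
            System.out.println(q.text + " " + q.answers);
        }
    }
}

A real assembly engine would, as the abstract indicates, combine multiple interdependent criteria (for example topic coverage, total difficulty, and point budget) rather than a single filter, and would track which variant was issued to which student so that scanned forms can later be scored against the correct answer key.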
Keywords:
Test composing, test scoring, criteria definition, SCORM.