DESIGNING QUESTIONS FOR ON-LINE EVALUATION OF PHYSICS IN ENGINEERING DEGREES IN A GRADED MULTILINGUAL FORMAT
The crucial point in preparing an online test is developing good questions. This requires, first of all, understanding the objectives of the assessment and paying special attention to question design. A test is reliable when it actually measures what the evaluator intends to assess.
Nowadays, multiple-choice tests are very common and often replace constructed-response tests. The literature reveals no consensus on whether both test formats are equally suitable for measuring ability or knowledge [2, 3]. In this work we present a strategy that brings the two formats together, taking advantage of each. By using graded numerical questions we can reward partially correct answers, avoiding the all-or-nothing scoring rule and discouraging guessing strategies.
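The graded-scoring idea can be illustrated with a minimal sketch. The function names (`score_step`, `grade`), the relative tolerance, the step weights, and the example values are all hypothetical, not taken from the paper; they simply show how partial credit can replace all-or-nothing scoring of a numerical answer.

```python
# Hypothetical partial-credit scorer for a graded numerical question.
# Each step of a problem carries a weight; a step earns its weight when the
# submitted value lies within a relative tolerance of the reference value.

def score_step(submitted, reference, rel_tol=0.01):
    """Return True if the submitted value matches within rel_tol."""
    return abs(submitted - reference) <= rel_tol * abs(reference)

def grade(answers, key):
    """Sum the weights of the correctly answered steps.

    `answers` maps step name -> submitted value;
    `key` maps step name -> (reference value, weight).
    """
    total = 0.0
    for step, (reference, weight) in key.items():
        if step in answers and score_step(answers[step], reference):
            total += weight
    return total

# Example: a two-step problem graded partially instead of all-or-nothing.
key = {"time_of_flight": (2.04, 0.4), "range": (35.3, 0.6)}
print(grade({"time_of_flight": 2.05, "range": 20.0}, key))  # 0.4: first step correct
```

A student who gets only the intermediate result right still earns that step's weight, rather than losing the whole mark on the final answer.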
The key points of the questions can be summarized as follows:
- Numerical response. Among the available test formats, we have chosen the numerical one because it is the most suitable for evaluating physical concepts.
- Gradually required results. To avoid frustrating all-or-nothing scoring, we have prepared questions that assess not only the final result but also the process used to obtain it, evaluating the approach and intermediate responses.
- Multilingual format (Valencian, Spanish and English). The three languages are commonly used in our university courses and should therefore be taken into account in the assessment design.
- Structured text. We have structured the test questions into three parts: wording, data and results. This avoids authoring mistakes when data or conditions are repeated across languages.
- Replicability. Data values are randomly generated in a spreadsheet, so questions are easily replicated, producing similar items for a large number of students.
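The last three points can be sketched together in a short example. Everything here is an assumption for illustration: the wording, the incline problem, the variable names, and the use of Python's `random` module in place of the paper's spreadsheet. The sketch shows a question split into wording, data and results, with only the wording translated into the three languages, and a fixed seed so that the same variants can be regenerated.

```python
import math
import random

# Hypothetical question template: the wording is written once per language,
# while the data and the graded reference results are shared across languages.
WORDING = {
    "en": "A block of mass m slides down a frictionless incline of angle theta. Find its acceleration.",
    "es": "Un bloque de masa m desciende por un plano inclinado sin rozamiento de ángulo theta. Halle su aceleración.",
    "ca": "Un bloc de massa m llisca per un pla inclinat sense fregament d'angle theta. Trobeu la seua acceleració.",
}

def make_variant(rng):
    """Draw random data and compute the reference results for one variant."""
    m = rng.randint(1, 10)            # mass in kg
    theta = rng.choice([15, 30, 45])  # incline angle in degrees
    g = 9.81                          # gravitational acceleration, m/s^2
    a = g * math.sin(math.radians(theta))
    return {
        "wording": WORDING,
        "data": {"m": m, "theta": theta},
        "results": {"a": round(a, 2)},
    }

# A fixed seed makes the generation replicable: the same batch of similar
# questions can be reproduced for a large group of students.
rng = random.Random(42)
variants = [make_variant(rng) for _ in range(3)]
```

Because the numerical data and results live outside the wording, translating or correcting one language cannot drift out of sync with the others, which mirrors the structured-text point above.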
Hence, educators benefit from low grading costs, consistent grading and the absence of scoring bias, while students benefit from timely feedback, broader coverage of the syllabus and fair grading.
[1] Gikandi JW, Morrow D, Davis NE. Online formative assessment in higher education: A review of the literature. Comput Educ. 2011;57:2333-2351.
[2] Kastner M, Stangl B. Multiple choice and constructed response tests: Do test format and scoring matter? International Conference on Education and Educational Psychology 2010. 2011;12:263-273.
[3] Siri A, Freddano M. The use of item analysis for the improvement of objective examinations. 2nd International Conference on Education and Educational Psychology 2011. 2011;29.
[4] Hollister KK, Berenson ML. Proctored versus unproctored online exams: Studying the impact of exam environment on student performance. Decision Sciences Journal of Innovative Education. 2009;7(1):271-294.