About this paper

Appears in:
Page: 503 (abstract only)
Publication year: 2016
ISBN: 978-84-608-5617-7
ISSN: 2340-1079
doi: 10.21125/inted.2016.1122

Conference name: 10th International Technology, Education and Development Conference
Dates: 7-9 March, 2016
Location: Valencia, Spain

COURSE DESIGN AND EXAM RESULTS: AN EMPIRICAL ANALYSIS

M. Nettekoven

WU Vienna University of Economics and Business (AUSTRIA)
At the Vienna University of Economics and Business (WU) there are about 3,000 freshmen per year, all of whom have to attend a set of standardized courses (the so-called common body of knowledge), regardless of their study program and major.

The course “Introductory Finance” is part of this set and, due to the large number of students, used to be organized as a mass lecture with a single final multiple-choice exam at the end of the course. As is typical for such course designs, students tended to start learning only shortly before the exam and to forget the relevant topics soon afterwards. As we have shown in previous studies, (voluntary) homework assignments that award bonus points for the final exam are a good means of partly counteracting this bulimic learning behavior and motivating students to keep up with classes. Still, we were not satisfied with students’ learning behavior, with the exam results, or with the percentage of students who failed the exam.

At the beginning of this term, we changed the course design from a mass lecture (with about 180 students per class) with one single final exam to smaller classes of 60 students with several evaluations of students’ knowledge. Students can now earn credits for collaboration in class and presentation of homework assignments (15%), on the midterm test (25%), and on the final test (60%). To pass the class, they have to earn at least 60% of all available credits.
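The weighting described above can be sketched as a small calculation (function names and the example scores are illustrative, not from the paper):

```python
def overall_credits(collab_pct: float, midterm_pct: float, final_pct: float) -> float:
    """Weighted overall score under the new course design:
    collaboration/homework 15%, midterm test 25%, final test 60%.
    Each argument is the percentage of credits earned in that component."""
    return 0.15 * collab_pct + 0.25 * midterm_pct + 0.60 * final_pct


def passed(total_pct: float) -> bool:
    # Students pass with at least 60% of all available credits.
    return total_pct >= 60.0


# Illustrative example: full collaboration credits can compensate
# for a weak final test.
total = overall_credits(collab_pct=100, midterm_pct=70, final_pct=55)
print(total, passed(total))
```

Note that under this weighting a student scoring below 60% on the final can still pass if the other components are strong, which is one way the design rewards continuous engagement.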

We expect that with this new course design students will perform significantly better – this expectation is also supported by a number of previous studies. However, we want to analyze our specific course and situation using both statistical methods and direct feedback from our students to evaluate the changes in our course setup and quantify the outcome.
In this paper we concentrate on the statistical analysis of the exam results, comparing them to previous exams with the same or very similar (multiple-choice) questions.
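The abstract does not name the specific statistical method used. One common choice for comparing two cohorts’ scores on identical questions is a two-sample t-test; a minimal sketch of Welch’s t statistic (which does not assume equal variances), using made-up score samples, might look like this:

```python
import math
import statistics


def welch_t(sample_a: list[float], sample_b: list[float]) -> float:
    """Welch's t statistic for two independent samples with
    possibly unequal variances (e.g. old vs. new cohort scores)."""
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    n_a, n_b = len(sample_a), len(sample_b)
    standard_error = math.sqrt(var_a / n_a + var_b / n_b)
    return (mean_a - mean_b) / standard_error


# Illustrative (fabricated) exam scores for two cohorts:
new_cohort = [80, 85, 90]
old_cohort = [70, 75, 80]
print(round(welch_t(new_cohort, old_cohort), 2))
```

In practice one would also compute the degrees of freedom and a p-value (e.g. via a statistics library) before concluding that a difference between cohorts is significant.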

The preliminary analysis of the midterm test yields mixed results: some students performed significantly better on identical problems than students in the past, whereas others did not measure up to previous standards. Perhaps the latter group did not take the midterm test seriously enough, since it contributes only 25% to the overall grade.

The analysis of the final test in January, and of the resulting overall credits and grades, will show whether this assumption is true and will give further insight into the effects of the new course design on the exam results.
@InProceedings{NETTEKOVEN2016COU,
  author    = {Nettekoven, M.},
  title     = {Course Design and Exam Results: An Empirical Analysis},
  booktitle = {INTED2016 Proceedings},
  series    = {10th International Technology, Education and Development Conference},
  publisher = {IATED},
  address   = {Valencia, Spain},
  month     = mar,
  year      = {2016},
  pages     = {503},
  isbn      = {978-84-608-5617-7},
  issn      = {2340-1079},
  doi       = {10.21125/inted.2016.1122},
  url       = {http://dx.doi.org/10.21125/inted.2016.1122}
}