UTILIZING ELECTRONIC EXAMS IN PROGRAMMING COURSES: A CASE STUDY
Tampere University of Technology (FINLAND)
About this paper:
Appears in: EDULEARN16 Proceedings
Publication year: 2016
Pages: 7155-7160
ISBN: 978-84-608-8860-4
ISSN: 2340-1117
doi: 10.21125/edulearn.2016.0560
Conference name: 8th International Conference on Education and New Learning Technologies
Dates: 4-6 July, 2016
Location: Barcelona, Spain
Abstract:
Nowadays, it is increasingly common for university students to work during their studies, which leads to problems with the scheduling of courses and exams. A common solution is to move course material to the internet and web-based systems. However, web-based material does not solve the problem of exams, which are usually controlled and supervised in person. One way to address this is to use electronic exams, which allow flexible timetables and video-based monitoring against cheating. However, introducing such arrangements is not always straightforward and may require major changes to teaching processes. Based on a case study, this paper shows what should be taken into account when deploying electronic examinations in the setting of a programming course.

There are several ways to organize electronic exams. In Finland, a consortium of 20 universities is using a recently developed electronic examination system called Exam [1, 2]. The system supports, in particular, essay and multiple-choice examinations. In this study, our focus is on using the Exam system in computer programming tests. Rajala [3] and Richter [4] have discussed electronic programming exams within the framework of fixed schedules. Our case study deals with organizing a programming course examination without a fixed timetable, in a classroom that is controlled and monitored electronically.

The context of our study was a programming course with 50 students. The course included two separate tests, both prepared with the Exam system. The first test was conducted in two phases. The first phase, carried out by a few students, aimed to gain preliminary experience with the system; based on it, improvements were made to the examination environment and its instructions. In the second phase, the rest of the students took the first test. The second test had only one phase, after which feedback was gathered from the students with a questionnaire.

The analysis of the survey data shows that the students had a positive attitude toward the electronic exam, and especially toward the freedom to choose the exam time. The first phase revealed the potential of the electronic exam, and the subsequent iterations confirmed this. The main outcome of this study is a set of best practices for arranging an electronic exam in an automated, secured classroom. The study also demonstrates that programming examinations are well suited to the Exam system.

References:
[1] Rytkönen, A. (2014). Student Experiences on Taking Electronic Exams at the University of Helsinki: Electronic Examining at the University of Helsinki. In Proceedings of EdMedia: World Conference on Educational Media and Technology 2014, pp. 2114–2121.
[2] Rytkönen, A. (2015). Enhancing Feedback through Electronic Examining. In EDULEARN15 Proceedings, pp. 3456–3464.
[3] Rajala, T., Lokkila, E., Lindén, R., Laakso, M.J. (2015). Student Feedback about Electronic Exams in Introductory Programming Courses. In EDULEARN15 Proceedings, pp. 2795–2800.
[4] Richter, T., Boehringer, D. (2014). Towards Electronic Exams in Undergraduate Engineering: A Project on Numerical Mathematics in eExams at the University of Stuttgart, pp. 196–201.
Keywords:
Electronic exam, programming education, exam room, student feedback.