University of Graz (AUSTRIA)
About this paper:
Appears in: INTED2019 Proceedings
Publication year: 2019
Pages: 1814-1820
ISBN: 978-84-09-08619-1
ISSN: 2340-1079
doi: 10.21125/inted.2019.0519
Conference name: 13th International Technology, Education and Development Conference
Dates: 11-13 March, 2019
Location: Valencia, Spain
MOOCs (Massive Open Online Courses) are freely available online courses with no entry barriers that aim at unlimited participation. Several scientific contributions are now available that report on experiences and evaluation results of MOOC projects. These evaluations usually use questionnaires to collect personal user data such as age and education, satisfaction ratings, assessments of participants' knowledge, skills and attitudes after taking a MOOC, costs and other benchmarks regarding development effort or completion rates, and other user data that is analysed through learning analytics approaches [1]. Increasingly, MOOCs are also used to facilitate Information Literacy (IL), which refers to skills in finding, managing, and creating data and information. While there are already several studies that evaluate MOOC projects, evaluations of MOOCs on IL are lacking, especially with regard to participants' perceived knowledge gain.

The aim of this paper is to present a possible evaluation framework for MOOCs on IL and to demonstrate its practicability through findings from a first evaluation of a MOOC on IL. The suggested evaluation framework follows a three-fold approach: First, participants complete a standardized questionnaire to test their IL skills [2] before and after attending the MOOC. Second, they conduct a heuristic evaluation in groups of two to three persons and assign severity ratings [3]. Third, they document the time needed to work through the learning content. The first application of the evaluation was conducted with a sample of 10 students who had attended the MOOC as part of a class in the final year of the bachelor's program in business administration at the University of Graz, Austria.
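The three data sources of the framework could be aggregated as in the following minimal sketch. All variable names and sample values are purely illustrative assumptions, not data from the study; severity ratings are assumed to use Nielsen's 0-4 scale [3].

```python
# Hypothetical sketch: aggregating the three data sources of the
# evaluation framework (pre/post IL test scores, heuristic severity
# ratings, documented vs. planned learning time). Values are invented
# for illustration only.
from statistics import mean

# Pre- and post-MOOC IL test scores (percent correct) per participant.
pre_scores = [55, 60, 48, 70, 62, 58, 65, 50, 72, 61]
post_scores = [68, 74, 60, 82, 75, 70, 78, 63, 85, 73]

# Severity ratings per heuristic finding (assumed 0-4 scale).
severity_ratings = [1, 3, 2, 4, 2, 1, 3]

# Documented learning time per unit (minutes) vs. the didactical plan.
documented_minutes = [35, 50, 40, 45]
planned_minutes = [30, 45, 45, 40]

# Average per-participant score gain from pre- to post-test.
avg_gain = mean(post - pre for pre, post in zip(pre_scores, post_scores))
# Average perceived importance of the reported usability problems.
avg_severity = mean(severity_ratings)
# Deviation of documented from planned learning time per unit.
time_deviation = [d - p for d, p in zip(documented_minutes, planned_minutes)]

print(f"Average score gain: {avg_gain:.1f} points")
print(f"Average severity rating: {avg_severity:.2f}")
print(f"Time deviation per unit (min): {time_deviation}")
```

Such a summary makes the three strands directly comparable against the didactical plan, though the study itself reports richer qualitative detail from the heuristic evaluation.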

The findings show, first, that the results of the IL skills test improved after the students attended the MOOC. This confirms that MOOCs are a suitable approach to enhancing students' IL skills. Second, the heuristic evaluation revealed existing inconsistencies in the content and user interface, along with ratings of their perceived importance. Third, the documented learning time allowed comparisons with the plans from the didactical framework. The students also found the MOOC, and the way the evaluation was conducted, helpful for themselves. Thus, the evaluation approach turned out to be suitable for evaluations in this particular context. The described evaluation approach can open avenues for future projects developing MOOCs on IL or related topics.

[1] J. E. Klobas, "Measuring the success of scaleable open online courses," Performance Measurement and Metrics, vol. 15, no. 3, pp. 145–162, 2014.
[2] L. Beutelspacher, "Erfassung von Informationskompetenz mithilfe von Multiple-Choice-Fragebogen," Information - Wissenschaft & Praxis, vol. 65, no. 6, pp. 341–352, 2014.
[3] J. Nielsen, Ed., Usability inspection methods. New York: Wiley, 1994.
Keywords: MOOC, Evaluation, Information Literacy.