J.C. Callens

Developing an efficient learning environment for students who follow a distance education program is a challenge. The literature offers a plethora of definitions and descriptions of efficiency in higher education. In general, efficiency is a relative concept, defined as the ratio of outputs produced (e.g., student success) to inputs used (e.g., staff, resources). The main focus of this contribution is the question of which assessment methodology best supports the development of an efficient learning environment and is thus the most efficient methodology.
Research question: What is the relative efficiency of an assessment methodology used in distance education?
By answering this research question, we aim to select elements for a prototype that can be used to implement a distance education program in higher education.

This research took place in a university college that organizes 30 distance education programs (with more than 2,000 students). To determine which assessment methodology is most efficient, data envelopment analysis (DEA) is used. DEA is a common non-parametric procedure for measuring the degree of efficiency. The data are collected from Square, an online tool that registers all actions students and their lecturers take to 1) take an exam, or 2) give feedback.
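The abstract does not detail the DEA formulation used; purely as an illustration of the general idea, the following is a minimal sketch of the standard input-oriented CCR model solved as a linear program, with hypothetical assessment set-ups as decision-making units (the data and variable names are invented for illustration, not taken from the study):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA: one efficiency score per DMU (1.0 = efficient).
    X: (n_dmus, n_inputs) input matrix, Y: (n_dmus, n_outputs) output matrix."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0                      # minimise the contraction factor theta
        A_ub = np.zeros((m + s, 1 + n))
        b_ub = np.zeros(m + s)
        # input constraints: sum_j lambda_j * x_ij <= theta * x_io
        A_ub[:m, 0] = -X[o]
        A_ub[:m, 1:] = X.T
        # output constraints: sum_j lambda_j * y_rj >= y_ro
        A_ub[m:, 1:] = -Y.T
        b_ub[m:] = -Y[o]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (1 + n), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# hypothetical data: 4 assessment set-ups,
# inputs = (staff hours, platform cost), output = pass rate
X = np.array([[20., 5.], [30., 8.], [15., 6.], [25., 4.]])
Y = np.array([[0.80], [0.85], [0.70], [0.90]])
print(dea_ccr_input(X, Y).round(3))
```

Units on the efficient frontier receive a score of 1.0; dominated units score below 1.0, which is the "relative efficiency" that the research question refers to.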

Analysis of the available data reveals that a higher degree of flexibility leads to better exam results (F(2, 3636) = 21.733; p < 0.001). On average, more learner control over the time and pace of taking an exam leads to higher exam results.
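The reported statistic comes from the study's own data, which are not reproduced here; purely as an illustration of how such a one-way ANOVA across flexibility conditions can be computed, a sketch with simulated, hypothetical scores (the group labels, means, and sample sizes are assumptions for the example only):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
# hypothetical exam scores (0-20 scale) for three flexibility conditions:
# a fixed exam date, a semi-flexible exam window, and fully self-paced exams
fixed      = rng.normal(11.0, 3.0, 200)
window     = rng.normal(12.0, 3.0, 200)
self_paced = rng.normal(13.0, 3.0, 200)

# one-way ANOVA: does mean exam score differ between the three conditions?
f_stat, p_val = f_oneway(fixed, window, self_paced)
print(f"F = {f_stat:.2f}, p = {p_val:.4f}")
```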

Further analysis of the available data shows that the level of prior knowledge also influences the impact that the degree of flexibility to take an exam may have on the exam result. The results reveal an interaction effect of prior knowledge with the degree of flexibility on the exam result (F(6, 56.034) = 3.374; p = 0.003). When there is more prior knowledge, the degree of flexibility is better utilized by the student. This is in line with previous research showing that the degree of impact of learner control (an inherent consequence of flexibilization) is partly influenced by a student's prior knowledge (see Daniels, 1996; Kopcha & Sullivan, 2008; Park, 1991; Scheiter & Gerjets, 2007; von Mizener & Williams, 2008).
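An interaction effect of this kind means that the gain from extra flexibility differs between prior-knowledge groups. As a minimal numerical illustration of that idea (the cell means below are hypothetical, not the study's data), the interaction shows up as a non-zero difference-in-differences of the cell means:

```python
import numpy as np

# hypothetical mean exam scores (0-20 scale) per cell:
# rows = prior knowledge (low, high), cols = flexibility (low, high)
cell_means = np.array([[11.0, 11.5],   # low prior knowledge
                       [12.0, 14.5]])  # high prior knowledge

# gain from extra flexibility within each prior-knowledge group
gain = cell_means[:, 1] - cell_means[:, 0]

# a non-zero difference-in-differences signals an interaction:
# high-prior-knowledge students profit more from flexibility
interaction = gain[1] - gain[0]
print(gain, interaction)
```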

Further, a one-way ANOVA indicates that, on average, exam results are slightly higher in the condition where students receive fast feedback than where feedback is slower; this difference, however, is small and not statistically significant (F(2, 2.767) = 0.166; p = 0.847).

In conclusion, directions for further research are outlined following the discussion of the results.