DOES THE QUALITY AND QUANTITY OF EXAM REVISION IMPACT ON STUDENT SATISFACTION AND PERFORMANCE IN THE EXAM ITSELF?: PERSPECTIVES FROM UNDERGRADUATE DISTANCE LEARNERS
The Open University (UNITED KINGDOM)
About this paper:
Appears in: EDULEARN16 Proceedings
Publication year: 2016
Pages: 5052-5061
ISBN: 978-84-608-8860-4
ISSN: 2340-1117
doi: 10.21125/edulearn.2016.2197
Conference name: 8th International Conference on Education and New Learning Technologies
Dates: 4-6 July, 2016
Location: Barcelona, Spain
Abstract:
This paper reports the findings of a large-scale survey into the student experience of assessment at a distance learning university. Three key aspects of assessment were covered in the survey: formative assessment, revision for the examination, and the examination/end-of-module assessment itself, with a view to providing insight for more effective learning design and learning analytics. This analysis meets an urgent need to better understand the assessment analytics associated with the ‘revision’ period – the weeks leading up to an examination that may be crucial in ensuring student assessment success, building confidence, and improving progression to the next course.

Using results from an online questionnaire (n=281) sent to undergraduate distance learners and follow-up telephone interviews (n=13), this paper will examine some of the relationships between the revision and examination experience. Specific regard will be paid to the usefulness of revision resources, time spent revising, enjoyment, reflection and learning, exam preparedness and clarity, mark satisfaction and score received. The paper will begin with an overview of the central findings of the survey and will then focus on the relationship between the ‘revision’ period and the examination itself, with a key focus on experience, performance and self-reported learning effort.

This research represents an important step in extending the scope of assessment analytics and in better understanding the opportunities for providing more timely, targeted or personalised learning support. Key findings of the reported analysis are that revising for an exam and the exam itself are relatively distinct experiences; that there is no significant correlation between time spent revising, the usefulness of revision resources and module exam score; and that revision for learning, revision design and satisfaction with revision resources appear as distinct factors in the student experience. The results have implications for the design and teaching of assessment.
Keywords:
Learning Analytics, Student Experience, Examinations, Revision, Assessment Analytics, Learning Design, Student Survey, Higher Education, Distance Learning.