LEARNING ANALYTICS AND TASK DESIGN IN SCIENCE EDUCATION
NTNU - Norwegian University of Science and Technology (NORWAY)
About this paper:
Appears in: EDULEARN17 Proceedings
Publication year: 2017
Pages: 8021-8024
ISBN: 978-84-697-3777-4
ISSN: 2340-1117
doi: 10.21125/edulearn.2017.0476
Conference name: 9th International Conference on Education and New Learning Technologies
Dates: 3-5 July, 2017
Location: Barcelona, Spain
Abstract:
This study draws on data and experiences gained during an action research study in a third-year science and engineering course with approximately 40 students enrolled. The module under scrutiny required foundational knowledge and skills in mathematics and statistics and carried a workload of 7.5 credits, or 25% of a full semester. Two experienced professors taught their respective parts of the course; yet, despite positive feedback on the lectures, failure rates had remained high for several consecutive years. Existing quality assurance procedures failed to remedy the situation, which in turn motivated a fresh intervention guided by the experience of an educational researcher.

Typically, quality assurance instruments focus on student satisfaction rather than adopting an explanatory approach to better understand poor academic achievement. This study draws on exam scores, and the data analysis was guided by two research questions: (1) How can the analysis of exam scores help identify patterns of achievement in need of improvement? (2) What is the potential of task design to enhance learning in those areas?
Action research was used as a process of inquiry that serves those taking the action. The purpose of applying this methodology was to assist in the analysis of the data, to help theorize the outcomes, and to adopt measures with the potential to make a difference to learning. The approach was much appreciated by the professors involved, since they were actively engaged in the entire research process, from data collection through data analysis to the implementation of conceptual tasks intended to improve learning.

The construct of learning analytics is defined as the collection, analysis and reporting of performance data for the purpose of understanding and optimizing learning. Exam scores were used to identify patterns of performance, which were later used to theorize explanations. The findings were that most candidates demonstrated acceptable levels of algorithmic ability, while conceptual understanding was largely missing. This was theorized to be a consequence of the nature of the tasks set in exercises and at the final exam, and it was concluded that a redesign should better balance the weight of calculations against conceptual questions. Finally, a mid-term exam accounting for 30% of the grade was introduced to strengthen the conceptual approach.
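To make the analytics step concrete, the following is a minimal sketch of how per-task exam scores could be aggregated by task type to surface the pattern reported above. The paper does not publish its data format or analysis code, so the table layout, task labels and numbers here are invented for illustration.

```python
# Illustrative sketch only: a hypothetical per-task score table in which
# each exam task is labelled "algorithmic" or "conceptual". Scores are
# fractions of the maximum mark per task, one entry per candidate.
from statistics import mean

exam_scores = [
    {"task": "T1", "type": "algorithmic", "scores": [0.9, 0.8, 0.7, 0.85]},
    {"task": "T2", "type": "algorithmic", "scores": [0.75, 0.8, 0.9, 0.6]},
    {"task": "T3", "type": "conceptual",  "scores": [0.3, 0.2, 0.4, 0.25]},
    {"task": "T4", "type": "conceptual",  "scores": [0.35, 0.1, 0.3, 0.2]},
]

def attainment_by_type(items):
    """Average attainment per task type, to expose achievement patterns."""
    by_type = {}
    for item in items:
        by_type.setdefault(item["type"], []).extend(item["scores"])
    return {t: mean(vals) for t, vals in by_type.items()}

print(attainment_by_type(exam_scores))
# e.g. {'algorithmic': 0.79, 'conceptual': 0.26} -- the kind of gap the
# study reports between algorithmic ability and conceptual understanding.
```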

In the ensuing semester, an identical research design was applied to the new data set, and a student survey was administered to gather experiences, comments and suggestions. This time the failure rate dropped dramatically, and the majority of candidates endorsed the greater variation in task types. Several commented that the conceptual tasks extended their understanding and increased their ability to cope with complex questions and tasks. Students' approaches to learning shifted from an emphasis on memorizing facts and equations to one in which they were also able to reason and argue.
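With cohorts of roughly 40 students, one standard way to check that such a drop in failure rate is unlikely to be chance is a test on the two proportions. The sketch below uses Fisher's exact test; the paper reports no exact counts, so the figures are invented placeholders.

```python
# Illustrative sketch only: cohort sizes and failure counts are invented,
# since the paper reports only that the failure rate "dropped dramatically".
from scipy.stats import fisher_exact

# Hypothetical 2x2 contingency table: [failed, passed] per cohort (~40 each).
before = [16, 24]   # pre-intervention cohort
after = [4, 36]     # post-intervention cohort

odds_ratio, p_value = fisher_exact([before, after])
print(f"failure rate before: {before[0] / sum(before):.0%}")
print(f"failure rate after:  {after[0] / sum(after):.0%}")
print(f"Fisher's exact p-value: {p_value:.3f}")
```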

The major lesson to be learned is that the types of tasks and questions posed to students are the most direct way of scaffolding their learning, far more so than satisfaction surveys of teaching.
Keywords:
Learning Analytics, Task Design, Science Education.