ENABLING LEARNING ANALYTICS: DEFINING AND CATEGORIZING TYPES OF TASKS FOR EDUCATIONAL PURPOSES
NTNU - Norwegian University of Science and Technology (NORWAY)
About this paper:
Appears in: EDULEARN18 Proceedings
Publication year: 2018
Pages: 1408-1412
ISBN: 978-84-09-02709-5
ISSN: 2340-1117
doi: 10.21125/edulearn.2018.0443
Conference name: 10th International Conference on Education and New Learning Technologies
Dates: 2-4 July, 2018
Location: Palma, Spain
Abstract:
Learning analytics (LA) has been defined as the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purpose of understanding and optimizing learning and the environments in which it occurs. LA is a relatively new research field, and evidence about its methods and impact is still scarce. Learning analytics projects are said to be ‘data-rich’ and ‘theory-poor,’ indicating a need for theoretical and pedagogical perspectives to guide the process from data collection and analysis to intervention. Monitoring interaction patterns in student groups is of limited interest unless the data are analyzed in light of pedagogical intent.

Important elements of learning design are (1) a set of resources available to all students, (2) support structures that assist in the provision of those resources, and (3) the selection of tasks that students are expected to carry out by drawing on those resources. This study works from the assumption that the selection of tasks and questions is the most direct way to engage students in learning. As an action research project, the study involved the author and two professors teaching a third/fourth-year science and engineering course. The paper first scans the research literature to identify a suitable problem design framework, which is then used as an analytical tool to map the distribution of task types in a concrete science/engineering module.

The study applies a distinction between three types of problems: algorithmic, conceptual, and open-ended. Algorithmic problems supply the required data and a solution method, whereas conceptual problems require an understanding of the underlying concept rather than the application of memorized procedures. Open-ended questions may be combined with either type of problem to probe understanding. The careful selection and structuring of tasks is therefore vital for both analytical and remedial purposes.
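
To make the typology concrete, here is a minimal sketch of how exam items could be modeled for later analysis (Python; all names are hypothetical, since the study itself relied on manual coding by faculty rather than software):

    from dataclasses import dataclass
    from enum import Enum

    class ProblemType(Enum):
        ALGORITHMIC = "algorithmic"  # required data and solution method are given
        CONCEPTUAL = "conceptual"    # hinges on grasping the underlying concept

    @dataclass
    class ExamItem:
        item_id: str                 # hypothetical identifier, e.g. "2011-Q1"
        dominant: ProblemType        # "fully or mainly" algorithmic/conceptual
        fully: bool                  # True = fully, False = mainly
        open_ended: bool = False     # open-ended add-on, combinable with either type

    # Illustrative usage (made-up items, not the paper's data)
    items = [
        ExamItem("2011-Q1", ProblemType.ALGORITHMIC, fully=True),
        ExamItem("2011-Q2", ProblemType.CONCEPTUAL, fully=False, open_ended=True),
    ]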

Two faculty members volunteered to categorize exam items using this framework for seven consecutive years, 2011 to 2017. The analysis showed that most problems were fully or mainly algorithmic (72%), while fully or mainly conceptual problems accounted for only 28% of the total. The dominance of algorithmic problems recurred with only minor variation from year to year. Students were good at calculations, while responses to conceptual questions uncovered learning deficiencies. These findings sparked internal discussions followed by comprehensive revisions of the entire module.
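
As a rough illustration of how coded items could be aggregated into such a distribution, the following sketch (again hypothetical, with made-up counts; the paper's 72%/28% split came from the faculty's manual categorization) computes the share of each dominant type:

    from collections import Counter

    # Hypothetical coded items as (year, dominant type) pairs, entirely made up
    coded = [
        (2011, "algorithmic"), (2011, "conceptual"),
        (2012, "algorithmic"), (2012, "algorithmic"),
    ]

    def distribution(items):
        """Share of each dominant problem type across all coded items."""
        counts = Counter(dominant for _, dominant in items)
        total = sum(counts.values())
        return {ptype: round(n / total, 2) for ptype, n in counts.items()}

    print(distribution(coded))  # {'algorithmic': 0.75, 'conceptual': 0.25}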

While the construction of tasks has always been part of the academic portfolio, such undertakings have largely drawn on experience and tacit knowledge. In this study, the task design framework was introduced retrospectively; the conclusion, however, is that its active use in the planning phase is likely to facilitate subsequent data analysis. The framework might also promote constructively aligned interventions in course design. After all, the purpose of learning analytics is not to collect data for its own sake, but to translate data into actionable recommendations that improve learning.
Keywords:
Science/engineering, learning analytics, types of tasks.