LEARNING DESIGN, LEARNING ANALYTICS, AND LEARNING MANAGEMENT SYSTEMS
Aarhus University (DENMARK)
About this paper:
Appears in: ICERI2018 Proceedings
Publication year: 2018
Pages: 2149-2154
ISBN: 978-84-09-05948-5
ISSN: 2340-1095
doi: 10.21125/iceri.2018.1474
Conference name: 11th annual International Conference of Education, Research and Innovation
Dates: 12-14 November, 2018
Location: Seville, Spain
Abstract:
Learning design and learning analytics are gaining a foothold in higher education as methodologies for improving teaching and learning in a systematic and sustainable way, based on pedagogical theory and insights into students’ activity and achievements (Conole, 2013; Ferguson, 2012). Though the two areas are interlinked, little work has been done to combine them, that is, to link learning analytics to learning design by using collected data to inform the learning design in a systematic way. Furthermore, support for this link is often absent in standard learning management systems (LMS), requiring the educator to make sense of data that is not directly linked to a learning design.

This paper presents an educational development methodology currently deployed at the Faculty of Science and Technology at Aarhus University for linking learning design, learning analytics, and LMS. The methodology describes:
(1) how a learning design is developed;
(2) how it is implemented in the LMS including how data is collected, analysed, and presented to the educator throughout the module; and
(3) how summative learning analytics informs the future learning design.

The learning design process starts with a comprehensive face-to-face educator workshop, typically of 4–5 hours, run by educational developers. The educators are introduced to the underlying ideas, models, and examples of learning design. By means of the Open University’s ‘Curriculum Feature cards’ (2018), the educators individually define the key qualities of their own learning design. During the following three weeks, the educators draft their learning design, represent it by means of the LDTool (University of Wollongong, 2018), and share it with the educational developers for feedback. Based on individual supervision, the final learning design is developed and implemented in the LMS over a three-week period. A final peer feedback workshop of 2–3 hours is then organised to refine the designs and clarify any misunderstandings, after which the module is delivered.

During module delivery, students’ online activity, online interaction, academic performance (e.g., quiz scores), and the utility of the LMS in terms of the diversity of tools used and time flexibility are monitored by means of a so-called ‘Barometer’. The Barometer is implemented as calculated columns in the Grade Centre of Blackboard Learn and monitors these four indicators, allowing the educator to provide aggregated or individual feedback to support students’ learning.
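The paper does not specify how the Barometer combines its indicators, but the idea of aggregating the four monitored signals into a single per-student reading can be sketched as follows. All names, ceilings, and the equal-weight averaging here are illustrative assumptions, not the actual Grade Centre formulas:

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    """Hypothetical per-student data for the four monitored indicators."""
    logins: int          # online activity (e.g., session count)
    posts: int           # online interaction (e.g., forum posts)
    quiz_score: float    # academic performance, on a 0-100 scale
    tools_used: int      # diversity of LMS tools used

def barometer(record: StudentRecord,
              max_logins: int = 50,
              max_posts: int = 20,
              max_tools: int = 8) -> float:
    """Aggregate the four indicators into a single 0-100 reading.

    Each indicator is normalised to [0, 1] against an assumed
    course-level ceiling, then the four are averaged with equal
    weights. Real calculated columns could weight them differently.
    """
    indicators = [
        min(record.logins / max_logins, 1.0),
        min(record.posts / max_posts, 1.0),
        record.quiz_score / 100.0,
        min(record.tools_used / max_tools, 1.0),
    ]
    return round(100 * sum(indicators) / len(indicators), 1)
```

A reading computed this way could be shown per student or averaged across the cohort, mirroring the aggregated versus individual feedback the Barometer supports.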

Upon completion of the module, the students participate in an extended version of the institutional module evaluation, and the educator participates in a survey. Furthermore, additional information about the learning design, grades, and time consumption is collected. These data form the basis of a summative learning analytics exercise conducted by the educational developers to analyse the efforts and impacts of the different learning designs and to identify factors that contribute to efficient learning design (Godsk, 2018). Designs, results, and know-how are subsequently shared with other educators to inform new designs in later workshops and revisions of existing designs.

At the time of writing, the described methodology is being implemented at the Faculty of Science and Technology; hence, empirical evidence will be available for the final paper.
Keywords:
Learning design, technology-enhanced learning, educational development, professional development, STEM education, educational technology.