1 Indiana University, Department of eLearning Design and Services (UNITED STATES)
2 Indiana University, Department of Law (UNITED STATES)
About this paper:
Appears in: EDULEARN19 Proceedings
Publication year: 2019
Pages: 8858-8863
ISBN: 978-84-09-12031-4
ISSN: 2340-1117
doi: 10.21125/edulearn.2019.2202
Conference name: 11th International Conference on Education and New Learning Technologies
Dates: 1-3 July, 2019
Location: Palma, Spain
In asynchronous online courses, student engagement is crucial to students’ learning, their comprehension of educational materials, and their ability to apply the frameworks, theories, and tools of a profession. Theoretically, the more students engage with a course, the better equipped they will be to succeed in it. In online education, the physical classroom is replaced with a virtual environment in the form of a learning management system (LMS), which contains interactive activities, discussion boards, electronic textbooks, and video lectures. In this environment, engagement can be defined as student behaviors captured by the LMS and other technology tools (e.g., total time in a course, page views, video plays, user clicks) (Pazzaglia, Clements, Lavigne, & Stafford, 2016). Although user logs reflect only what a student clicked on, not the quality of interactions with course materials, research indicates a positive connection between these actions and course outcomes: studies have shown that a higher frequency of logins, clicks, and time logged into a course is associated with higher course grades (Hung, Hsu, & Rice, 2012; Morris, Finnegan, & Sz-Shyan, 2005; Pazzaglia et al., 2016).

This case study focuses on the authors’ efforts to engage students in one undergraduate online course and on the substantive revisions made to subsequent offerings of the course based on user data that showed low engagement. The data included user access logs from the LMS (i.e., total time in course, page views, assignment completion rates, pages read in the course e-text) and video play data captured by video platforms. Relevant user data for each student were exported for analysis, and descriptive statistics were applied to identify engagement trends across the two versions of the course. User data were triangulated with student responses to mid-term and post-course survey instruments requesting feedback on the course. These surveys were created to gather feedback for course- and program-level improvement, with a particular focus on increasing student engagement. Using the data collected, the authors examined the following research questions: 1) Which elements of version one of the course had low student engagement, as measured by student behavior captured in the learning management system (LMS) and other technology tools? 2) Did revisions to the identified elements improve student engagement in version two of the course?
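The analysis described above — exporting per-student engagement records and summarizing them with descriptive statistics — can be sketched in a few lines of Python. The record structure and field names below are purely illustrative assumptions, not the actual LMS export schema used in the study:

```python
import statistics

# Hypothetical exported LMS user-log records, one per student.
# Field names (total_minutes, page_views, video_plays) are illustrative only.
records = [
    {"student": "S1", "total_minutes": 310, "page_views": 142, "video_plays": 9},
    {"student": "S2", "total_minutes": 45,  "page_views": 23,  "video_plays": 1},
    {"student": "S3", "total_minutes": 180, "page_views": 88,  "video_plays": 5},
    {"student": "S4", "total_minutes": 400, "page_views": 190, "video_plays": 12},
]

def describe(metric):
    """Return simple descriptive statistics for one engagement metric."""
    values = [r[metric] for r in records]
    return {
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "stdev": statistics.stdev(values),
        "min": min(values),
        "max": max(values),
    }

# Summarize each engagement metric across the cohort.
for metric in ("total_minutes", "page_views", "video_plays"):
    print(metric, describe(metric))
```

Comparing these summaries between version one and version two of a course (e.g., did median page views on a revised module rise?) is the kind of low-cost trend check the study relies on.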

Based on the preliminary findings from this study, the user data show that revisions to two key components of the course increased student engagement. Given the expense and time commitment of online course design and development (Visser, 2000) and increased pressure to ensure high degree completion rates (Kelchen, 2018), the authors argue that using readily available LMS user data to identify instructional problems and justify strategic revisions could be an increasingly vital strategy going forward.
Keywords: Student engagement, asynchronous, online, distance learning, assessment, educational technology, pipeline course, legal education, pathways course.