COURSE CONSISTENCY IN AN ONLINE GRADUATE NURSING PROGRAM
Clarkson College (UNITED STATES)
About this paper:
Appears in: INTED2022 Proceedings
Publication year: 2022
Pages: 93-97
ISBN: 978-84-09-37758-9
ISSN: 2340-1079
doi: 10.21125/inted.2022.0076
Conference name: 16th International Technology, Education and Development Conference
Dates: 7-8 March, 2022
Location: Online Conference
Abstract:
End-of-course student evaluations provide program faculty with a wealth of information, including which aspects of a course worked for students and which did not. When these evaluations are used in combination with Graduate Exit Surveys, faculty can enhance course delivery, increase course consistency across the curriculum, and, in general, provide a product for the end-users, i.e., learners and other faculty members, that is easy to use and that lets learners navigate a new course easily, spending their cognitive load on new information rather than on figuring out where things are located in the new course.

In 2020, members of the administration team reviewed all Graduate Nursing courses for all program options (e.g., doctor of nursing practice, core master's-level, nurse practitioner, nurse educator, and healthcare administrator courses). The Course Review Checklist was created for this process using a combination of:
(a) the Online Course Evaluation Rubric (OCER) used by the College Center for Teaching Excellence (CTE) to evaluate all College courses, and
(b) best practices in online education/teaching and learning, harvested from a review of the literature.

The focus of the review was not on content; this was made very clear to all faculty at the beginning, throughout, and at the end of the initiative. Instead, the review focused on best practices for online education, such as consistency across courses in page layout, orientation, and other logistical elements.
A timeline was established for all courses wherein a prescribed number of milestone tasks were required. These tasks related to module overviews and competencies, course navigation, exceedingly long pages (which required the learner to scroll down to view all content), direct links, objective rubrics, printable directions, and ensuring that all online/video lectures were in the appropriate format.

The established timeline began in October 2020 and ended immediately before the fall 2021 semester began, with milestone tasks due each month based on the date of the initial meeting with course faculty. A total of 170 milestone tasks were identified across 38 courses. A Master-CR (i.e., Course Review) course shell was created in Canvas for each course, and faculty were directed to begin the course review process by importing the current version of the course into that master shell. It was in these master course shells that program administration verified that milestone tasks had been completed.

By the final deadline, the day before fall classes began in August 2021, 99.14% of the required changes had been made, and the remaining 0.86% were completed within a week of the start of term. We will have to wait for the next round of end-of-course evaluations to determine the efficacy of the course review process. In the short term, while there was indeed resistance from some faculty members, most faculty were grateful for the opportunity to work with others to brainstorm ideas for enhancing their courses.
Keywords:
Online course, quality assurance, quality improvement.