EXPLORING THE VARIABILITY OF EXPERT CLINICAL REASONING WHEN DESIGNING A LEARNING BY CONCORDANCE TOOL
Université de Montréal (CANADA)
About this paper:
Appears in: ICERI2023 Proceedings
Publication year: 2023
Pages: 1118-1123
ISBN: 978-84-09-55942-8
ISSN: 2340-1095
doi: 10.21125/iceri.2023.0367
Conference name: 16th annual International Conference of Education, Research and Innovation
Dates: 13-15 November, 2023
Location: Seville, Spain
Abstract:
Background:
Nursing students require preparation to develop strong clinical reasoning (CR). While numerous educational strategies can enhance the development of nursing students' CR, strategies that address the uncertainty of clinical practice are rare.

One innovative strategy is the use of the online Learning by Concordance (LbC) tool. This tool presents students with a brief, incomplete description of a clinical situation together with a series of possible interventions to reflect upon. For each item, an initial hypothesis is provided, followed by new information; students must indicate whether the new information refutes, reinforces, or has no impact on the initial hypothesis. Automated feedback then presents the experts' reasoning behind their decisions, namely the experts' response choices for each item and a written comment explaining those choices. This feedback allows students to assess their level of concordance with the experts' choices and to identify knowledge gaps. However, the variability of experts' decisions and the content of their written comments have not been extensively studied, despite their importance in providing formative feedback to students.

The objective of this study was to examine the variability of expert decisions and their written comments in an LbC tool that consisted of fifteen simulated nursing practice situations.

Methods:
A total of 21 nursing experts participated in the study and responded to an online questionnaire. They were asked to make response choices related to clinical decisions (n=45 items) and to provide a written comment justifying each choice. Interrater agreement was calculated, and descriptive statistics were analyzed. Additionally, a descriptive content analysis of the experts' written rationales was conducted.
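The abstract does not specify which form of the intraclass correlation coefficient was used; as a minimal sketch, assuming a two-way random-effects model with absolute agreement for a single rater (ICC(2,1)) and experts' response choices coded numerically, the agreement statistic could be computed as follows (the function name and coding scheme are illustrative, not from the paper):

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_items, n_raters) matrix of numerically coded response choices.
    """
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-item means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Partition total variability into item, rater, and residual components.
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1) formula.
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )
```

With perfect agreement across raters the function returns 1.0; systematic disagreement drives it toward (or below) zero, consistent with the "poor agreement" interpretation of low ICC values reported in the Results.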

Results:
The results show that the overall interrater agreement among the nursing experts was poor, with an intraclass correlation coefficient (ICC) of 0.39 (95% confidence interval, 0.29-0.51). The analysis of the 45 items revealed that certain themes exhibited greater variability in the experts' choices, including the prioritization of care, the choice of dressing for chronic wounds, and the administration of medication in situations of clinical deterioration. The experts' written comments drew heavily on guidelines, their clinical experience, and contextual elements of clinical practice.

Conclusions:
The LbC tool, which combines questioning with expert modeling, offers an innovative pedagogical approach that replicates the uncertainty of clinical practice. Such a tool is valuable for presenting learners with complex problems that lack standardized solutions. The study's findings also shed light on the domains that generate the most variability in experts' response choices. These results suggest the importance of calibrating the level of uncertainty in the clinical situations presented to students to ensure optimal pedagogical support.
Keywords:
Clinical reasoning, decision-making, nursing education, script concordance, uncertainty.