APPLYING DEBRIEFING ASSESSMENT FOR SIMULATION IN HEALTHCARE (DASH)© FOR EVALUATING DEBRIEFING COMPETENCE OF FACILITATORS AT CENTER FOR ADVANCED TRAINING IN CLINICAL SIMULATION, VIETNAM
University of Medicine and Pharmacy (VIETNAM)
About this paper:
Conference name: 15th International Technology, Education and Development Conference
Dates: 8-9 March, 2021
Location: Online Conference
Abstract:
Introduction:
Debriefing clinical simulation experiences has been identified as the most important step in clarifying and consolidating insights and lessons from simulations. At the ATCS center, facilitators use the Promoting Excellence and Reflective Learning in Simulation (PEARLS) debriefing framework and the Debriefing Assessment for Simulation in Healthcare (DASH)© instrument, which provide guidance and example educator behaviors to assist in evaluating and developing debriefing skills. Our study aims to assess and compare the debriefing competence of novice and well-trained educators.
Methods:
Every ATCS simulation educator must have attended at least one faculty development course that includes debriefing training. Six of them are simulation experts, and the other nine, who had no prior formal debriefing expertise, attended a 4-hour Simulation-Based Medical Education (SBME) seminar focused on debriefing. Each group debriefed the same 97 second-year medical students in a formative OSCE using the PEARLS debriefing script, and their debriefing competence was evaluated by their respective learners with the student version of the DASH (DASH-SV). The DASH-SV is based on a seven-point effectiveness scale and asks students to rate educators on six elements and the behaviors associated with each element. The Wilcoxon signed-rank test was used to compare the scores of the two groups of educators.
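As an illustration of this analysis, the following is a minimal Python sketch of the paired comparison with the Wilcoxon signed-rank test; the array names and ratings are hypothetical placeholders, assuming each of the 97 students provided one rating for each educator group.

```python
# Minimal sketch: Wilcoxon signed-rank test on paired DASH-SV ratings.
# Assumes each student rated both educator groups, so the two arrays are
# paired by student. The data below are hypothetical, not the study data.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)

# Hypothetical per-student mean ratings on the 7-point DASH-SV scale.
novice = np.clip(rng.normal(5.7, 0.8, size=97), 1, 7)
well_trained = np.clip(rng.normal(6.0, 0.7, size=97), 1, 7)

stat, p_value = wilcoxon(well_trained, novice)
print(f"Wilcoxon signed-rank statistic = {stat:.1f}, p = {p_value:.4g}")
```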
Results:
The internal consistency of the questionnaire was 0.946 (Cronbach's alpha). A significant difference between the two groups of educators was noted for all items of the DASH score. The well-trained educators' scores were significantly higher than those of the novices with regard to “establishing an engaging learning environment” (6.0 ± 0.7 vs. 5.7 ± 0.8, P=0.001); “maintaining an engaging learning environment” (6.2 ± 0.8 vs. 5.8 ± 0.8, P<0.001); “structuring the debriefing in an organized way” (6.1 ± 0.8 vs. 5.7 ± 0.8, P<0.001); “provoking engaging discussions” (6.0 ± 0.7 vs. 5.6 ± 0.8, P<0.001); “identifying and exploring performance gaps” (6.0 ± 0.8 vs. 5.6 ± 0.8, P<0.001); and “helping trainees to achieve or sustain good future performance” (6.0 ± 0.8 vs. 5.6 ± 0.8, P<0.001).
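For reference, Cronbach's alpha can be computed from a students-by-items rating matrix with the standard formula; the sketch below is illustrative only, using hypothetical data rather than the study's ratings.

```python
# Minimal sketch: Cronbach's alpha for a students-by-items rating matrix.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """ratings: 2-D array, rows = respondents, columns = items."""
    k = ratings.shape[1]
    item_vars = ratings.var(axis=0, ddof=1)      # per-item variance
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-point ratings for 97 students on the six DASH elements.
rng = np.random.default_rng(1)
base = rng.normal(6.0, 0.7, size=(97, 1))    # shared per-student tendency
ratings = np.clip(np.round(base + rng.normal(0, 0.4, size=(97, 6))), 1, 7)

print(f"Cronbach's alpha = {cronbach_alpha(ratings):.3f}")
```

In this toy example, a high alpha simply reflects that the six simulated items share a common per-student component, mirroring the interpretation caveat raised in the discussion below.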
Discussion:
The well-trained educators showed advanced debriefing skills. The novice educators, after only a short course, demonstrated good behaviors for facilitating learning within the context of the debriefing conversation. Further training will help them choose and skillfully use different conversational techniques and educational strategies to maximize the impact of debriefing. The alpha coefficient for the six items is over 0.9, suggesting that the items have relatively high internal consistency. However, a high estimate of alpha may indicate the presence of systematic errors [6]. Observational error is a typical cause of systematic error and may be related to learners' unfamiliarity with the scale and to drift that occurred when they checked the same scores without thinking [7].
Conclusion:
The well-trained educators have sufficient debriefing competence to conduct effective debriefing sessions that promote an engaging learning environment and achieve the learning objectives and goals. The others need further intensive simulation training courses to comprehend and apply the principles, techniques, and strategies of debriefing in their performance. In addition, the briefing before scenarios should be reviewed and revised to enhance learners' readiness, psychological safety, and engagement.
Keywords:
Debriefing, DASH, PEARLS, ATCS, UMP HCMC, Formative OSCE, Facilitator.