A TOOL FOR PROVIDING SIMULATED PATIENTS’ FEEDBACK TO STUDENTS DURING ONLINE SIMULATION INTERVIEWS
1 The University of Sydney (AUSTRALIA)
2 The University of New South Wales (AUSTRALIA)
About this paper:
Conference name: 10th International Conference on Education and New Learning Technologies
Dates: 2-4 July, 2018
Location: Palma, Spain
Abstract:
Communication skills training (CST) is an important component of medical education. Providing medical students with practice interviews with simulated patients (SPs), together with personal feedback on those interviews, is an established method in CST. Such feedback is a crucial element of CST for students' reflection and learning, and it is usually provided live by the SP or tutor at the end of the interview, or by another observer (or tutor) reviewing a video recording of the interview. However, these methods cannot capture the SP's contemporaneous feedback at the moments it arises during the interview. This paper proposes a feedback tool with which SPs can provide instantaneous feedback during online simulated interviews with students; the feedback is timestamped on the video recording of the interview. After the interview, students can review the feedback alongside the video recording. The feedback tool comprises two emoji buttons for providing positive or negative feedback, and a comment box. In this study, we investigated the factors that might influence SPs' feedback provision using the tool.
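The timestamping mechanism described above could be sketched as follows. This is a minimal illustration only; the class and field names (`FeedbackRecorder`, `record`, `counts`) are assumptions for exposition, not EQClinic's actual implementation.

```python
import time

class FeedbackRecorder:
    """Stores SP feedback events as offsets into the interview recording.

    Hypothetical sketch: each button press or comment is saved with the
    number of seconds elapsed since the interview started, so it can later
    be overlaid on the video at the matching playback position.
    """

    def __init__(self, interview_start):
        self.interview_start = interview_start  # epoch time the recording began
        self.events = []

    def record(self, kind, comment=None, now=None):
        """kind: 'positive' (thumbs up / smiley) or 'negative' (thumbs down / sad)."""
        now = time.time() if now is None else now
        offset = now - self.interview_start  # seconds into the video
        self.events.append({"offset": offset, "kind": kind, "comment": comment})

    def counts(self):
        """Return (positive, negative) feedback totals for this interview."""
        pos = sum(1 for e in self.events if e["kind"] == "positive")
        neg = sum(1 for e in self.events if e["kind"] == "negative")
        return pos, neg

# Example: an SP presses the positive button 90 s in,
# then leaves negative feedback with a comment at 150 s.
rec = FeedbackRecorder(interview_start=0.0)
rec.record("positive", now=90.0)
rec.record("negative", comment="Avoid jargon here", now=150.0)
```

Storing offsets rather than wall-clock times keeps the feedback aligned with video playback regardless of when the interview took place.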
Year 2 medical students from an Australian medical school were asked to complete videoconference consultations with SPs on an online CST platform called EQClinic. During each consultation, EQClinic randomly selected one of two visual designs of the emoji feedback tool to display to the SP, thereby randomising students into two groups: thumbs and faces. If the student was in the thumbs group, EQClinic displayed thumbs-up and thumbs-down emoji buttons for the SP to provide positive and negative feedback, respectively; if the student was in the faces group, a smiley face and a sad face were displayed instead. At the end of the online consultation, SPs completed an assessment form rating the student's communication skills and a questionnaire reporting their cognitive load when using the tool.
In total, 223 interviews were completed between 33 SPs and 204 students. Overall, 68.3% of the SPs used one of the two feedback tools. In both groups (faces and thumbs), SPs provided significantly more positive than negative feedback to students (P<0.001). SPs provided significantly more positive (P=0.038) and less negative (P=0.047) feedback when using the faces emoji buttons than when using the thumbs emoji buttons. SPs who reported lower cognitive load provided significantly more text-comment feedback (P<0.01). The number of positive feedback items (thumbs up or smiley face) SPs provided for each student was positively correlated with that student's assessment results (P=0.012), and the number of negative feedback items (thumbs down or sad face) for each student was negatively correlated with the student's assessment results (P<0.01).
In conclusion, this feedback tool provides a novel method of recording SPs' instantaneous feedback while they are being interviewed by students online, which has not been achievable before now. The visual design of the tool and the SP's cognitive load during the interview appear to affect the quality and quantity of SP feedback provision.
Keywords:
Clinical consultation, medical education, video-assisted feedback, communication skills, video conferencing, timestamp recorded feedback, simulated patients.