DIRECT OBSERVATION AND FEEDBACK TO TEACH THE PHYSICAL EXAM TO GRADUATE MEDICAL TRAINEES
1 Johns Hopkins University School of Medicine (UNITED STATES)
2 Sinai Hospital (UNITED STATES)
3 University of Minnesota (UNITED STATES)
About this paper:
Appears in: ICERI2018 Proceedings
Publication year: 2018
Page: 1245 (abstract only)
ISBN: 978-84-09-05948-5
ISSN: 2340-1095
doi: 10.21125/iceri.2018.1285
Conference name: 11th annual International Conference of Education, Research and Innovation
Dates: 12-14 November, 2018
Location: Seville, Spain
Abstract:
Problem Identification:
In the modern hospital, physicians spend little time at the bedside. This has contributed to a decline in physical exam skills and adversely affects patient care. Tools to assess physical exam skill are also limited: direct observation, while valuable, does not occur routinely and is qualitative and subjective. We adapted the framework of the UK's MRCP(UK) Practical Assessment of Clinical Examination Skills (PACES) to create a novel assessment and teaching activity for U.S. graduate medical trainees. This activity could become the basis for new Maintenance of Certification (MOC) credit for practicing physicians.

Description:
During the assessment, interns rotated through cardiovascular, pulmonary, and neurology stations. At each station, two faculty preceptors examined a patient and agreed on the physical findings present before the start of the assessment. Interns were given 10 minutes to examine the patient, present their findings to the faculty, and answer questions about differential diagnosis and management. Interns were assessed in five areas on a 3-point scale: exam technique, identification of physical signs, differential diagnosis, clinical judgment, and maintaining patient welfare. Faculty then provided 5 minutes of real-time feedback and bedside teaching. An intraclass correlation coefficient (ICC) was calculated for each station as a measure of inter-rater reliability.
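The abstract does not specify which ICC model was used. As an illustration only (not the study's actual analysis), a station-level ICC can be computed from a subjects-by-raters score matrix; the sketch below implements ICC(2,1), the two-way random-effects, single-rater form, with hypothetical ratings from two preceptors:

```python
# Illustrative sketch (not the study's analysis code): ICC(2,1),
# the two-way random-effects, single-rater intraclass correlation,
# computed from an n-subjects x k-raters score matrix.

def icc2_1(scores):
    """scores: list of rows, one per subject; each row holds k rater scores."""
    n = len(scores)       # number of subjects (e.g. examined interns)
    k = len(scores[0])    # number of raters (e.g. faculty preceptors)
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]

    # Two-way ANOVA decomposition (no interaction term).
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)  # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)  # between raters
    ss_err = ss_total - ss_rows - ss_cols                   # residual

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical 3-point-scale ratings: two preceptors scoring four interns.
ratings = [[1, 1], [2, 2], [3, 3], [2, 3]]
print(round(icc2_1(ratings), 3))  # -> 0.842
```

In practice a library routine (e.g. `pingouin.intraclass_corr`) would typically be used instead of hand-rolled ANOVA sums, and the choice of ICC model (1, 2, or 3; single vs. average rater) should match the rating design.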

Results:
Seventy-two interns rotated through the cardiac and pulmonary stations; 44 of these also rotated through the neurology station. Nine volunteer patients and 16 faculty preceptors participated. Overall, interns performed best on the pulmonary and cardiovascular stations and significantly worse on the neurology station. The ICC for the pulmonary station was significant at 0.77 (p<0.005); the ICCs for the cardiovascular and neurology stations did not reach significance. On a post-activity survey, interns rated the teaching they received after the assessment highly, ranking it significantly higher than other forms of physical exam feedback in five domains. Patients remarked that it was a pleasant experience and said they would participate again. Faculty preceptors also enjoyed the activity and agreed to participate in future sessions. The majority of faculty preceptors thought that participating as a preceptor in this activity would be a good way to obtain MOC credit, and a majority would also be willing to participate as examinees for MOC credit.

Discussion:
This pilot study demonstrates the feasibility of conducting a PACES-style assessment for U.S. trainees. The poor performance on the neurology station highlights an important opportunity to focus on bedside neurology education for medicine residents. The differences in inter-rater reliability (IRR) across stations identify a need to improve pre-activity faculty training and orientation so that faculty apply the scoring rubric with similar expectations. Beyond its value as an assessment tool, the direct feedback and education delivered by the faculty preceptors were appreciated by both trainees and patients. This combination of assessment and education could form the basis of a Maintenance of Certification (MOC) activity and is the subject of ongoing development as part of the American Board of Medical Specialties (ABMS) Visiting Scholars Program.
Keywords:
Physical exam assessment, bedside teaching, PACES.