DEVELOPMENT OF A REMOTE TEACHING SUPPORT SYSTEM FOR TRAINING HEAD AND NECK ULTRASOUND EXAMINATIONS
1 International College of Technology, Kanazawa (JAPAN)
2 Kanazawa Institute of Technology (JAPAN)
3 Kanazawa Medical University (JAPAN)
About this paper:
Appears in: EDULEARN24 Proceedings
Publication year: 2024
Pages: 1574-1581
ISBN: 978-84-09-62938-1
ISSN: 2340-1117
doi: 10.21125/edulearn.2024.0490
Conference name: 16th International Conference on Education and New Learning Technologies
Dates: 1-3 July, 2024
Location: Palma, Spain
Abstract:
At the otolaryngology department of Kanazawa Medical University, ultrasound probe techniques were traditionally demonstrated in face-to-face training sessions. However, the COVID-19 pandemic and a limited pool of instructors made organizing in-person practical sessions a logistical challenge, prompting the search for a way to move these sessions to distance learning. This research therefore aims to develop a remote teaching support system that can adequately transfer the skills needed to correctly perform ultrasound examinations as an otolaryngologist. The system we developed captures the teacher's ultrasound probe movements with a webcam, uses artificial intelligence (machine learning) to track the locations of the hand joints, and reconstructs a virtual probe in a 3D computer graphics (3DCG) virtual space from that information. The virtual probe data are then transmitted to remote learners via Web Real-Time Communication (WebRTC). The model, together with an anatomical diagram of the cervical region, is projected onto a life-sized physical model (mannequin) using a projector or a mixed reality (MR) device. By leveraging these technologies, the system allows teachers to demonstrate and transmit how the probe is handled, enabling remote learners to observe and replicate the techniques.
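The abstract does not name the hand-tracking library, so the sketch below assumes MediaPipe Hands as the joint tracker. The wrist-to-middle-finger heuristic for the probe axis and the JSON payload fields ("position", "axis") are illustrative choices for a WebRTC data-channel message, not the authors' exact method.

```python
import json
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands

def capture_probe_pose(frame_bgr, hands):
    """Extract hand-joint landmarks from one webcam frame and derive a
    simple probe pose (grip position plus pointing direction)."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    lm = result.multi_hand_landmarks[0].landmark
    wrist = np.array([lm[mp_hands.HandLandmark.WRIST].x,
                      lm[mp_hands.HandLandmark.WRIST].y,
                      lm[mp_hands.HandLandmark.WRIST].z])
    middle = np.array([lm[mp_hands.HandLandmark.MIDDLE_FINGER_MCP].x,
                       lm[mp_hands.HandLandmark.MIDDLE_FINGER_MCP].y,
                       lm[mp_hands.HandLandmark.MIDDLE_FINGER_MCP].z])
    axis = middle - wrist                       # assumed probe-axis direction
    axis = axis / (np.linalg.norm(axis) + 1e-9)
    # JSON payload that could be streamed to learners over a WebRTC data channel
    return json.dumps({"position": wrist.tolist(), "axis": axis.tolist()})

# Usage: grab one frame from the teacher-side webcam and print the pose message
if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
        ok, frame = cap.read()
        if ok:
            print(capture_probe_pose(frame, hands))
    cap.release()
```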

The distinctive feature of this system is that it acquires the teacher's probe position and orientation with a single webcam. This information is critical for recreating the virtual probe in three dimensions at the remote site. As a result, the system is adaptable and can operate in a variety of environments without special sensors. Moreover, because the information needed for the 3D model is generated from hand position alone, the system is not tied to a predetermined probe device and can be used with a wide range of ultrasound machines with differing probe shapes and sizes.
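Since only a single webcam is used, the probe orientation has to be inferred from the tracked hand axis. The decomposition below into a pitch/roll pair about the image axes is one possible formulation, given as an assumption rather than the paper's actual computation.

```python
import numpy as np

def probe_tilt_angles(axis):
    """Derive tilt angles (degrees) of the virtual probe from the hand-axis
    direction estimated from a single webcam. The pitch/roll split about the
    image axes is an illustrative choice."""
    x, y, z = axis / np.linalg.norm(axis)
    pitch = np.degrees(np.arctan2(z, np.hypot(x, y)))  # tilt toward/away from the camera
    roll = np.degrees(np.arctan2(x, -y))               # tilt within the image plane
    return pitch, roll

# Example: a hand axis pointing mostly "up" in image space, slightly toward the camera
print(probe_tilt_angles(np.array([0.10, -0.95, 0.30])))
```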

Through this system, the instructor can demonstrate to each student how to maneuver the probe during an examination without needing to be in the same physical space or in close proximity to one another. The system not only enables lessons to be conducted remotely, but also gives each student the opportunity to experience the lesson hands-on.

We conducted a performance evaluation experiment of the system. Specifically, we measured the position and tilt angle of the actual probe and of the virtual probe to investigate how accurately the real probe is reproduced in the virtual space. The results showed that the positional information of the probe was reproduced with high accuracy, whereas the rotational error increased as the tilt angle became larger.
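The evaluation can be summarized with simple per-sample error metrics. The sketch below compares logged real-probe and virtual-probe samples; the array shapes, units (mm, degrees), and the use of a correlation coefficient to check whether error grows with tilt are assumptions for illustration, not the paper's reported analysis.

```python
import numpy as np

def evaluate_reproduction(real_pos, virt_pos, real_tilt, virt_tilt):
    """real_pos/virt_pos: (N, 3) probe positions in mm;
    real_tilt/virt_tilt: (N,) tilt angles in degrees (assumed units)."""
    pos_err = np.linalg.norm(real_pos - virt_pos, axis=1)  # per-sample position error
    ang_err = np.abs(real_tilt - virt_tilt)                # per-sample angular error
    return {
        "mean_position_error_mm": float(pos_err.mean()),
        "mean_angle_error_deg": float(ang_err.mean()),
        # correlation between tilt magnitude and angular error, to check whether
        # the rotational error increases as the probe is tilted further
        "tilt_vs_error_corr": float(np.corrcoef(np.abs(real_tilt), ang_err)[0, 1]),
    }
```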

In conclusion, by mimicking the instructor's actions conveyed through the virtual probe, students can acquire the required practical skills remotely.
Keywords:
Mixed reality, ultrasound examination, distance learning, machine learning, design, hand tracking.