ADVANCING LEARNING TECHNOLOGIES THROUGH EMBEDDED ASSESSMENT: FOCUSING ON SIMULATION EDUCATION RESEARCH
SiTEL of MedStar Health (UNITED STATES)
About this paper:
Appears in:
EDULEARN12 Proceedings
Publication year: 2012
Pages: 5336-5341
ISBN: 978-84-695-3491-5
ISSN: 2340-1117
Conference name: 4th International Conference on Education and New Learning Technologies
Dates: 2-4 July, 2012
Location: Barcelona, Spain
Abstract:
Background:
Increasingly, healthcare education needs to blend learning across physical and virtual worlds. Yet the current learning technology paradigm still relies predominantly on the apprenticeship model, establishing competency by counting performed tasks, with little attention paid to improving problem solving through critical thinking or to developing expertise through decision making. Medical Education Simulations (MedSims) are developed to train allied health professionals, ranging from medical students and residents to clinicians refining best-practice approaches. Research efforts at SiTEL, the simulation training environments laboratory of MedStar Health, which employs more than 27,000 allied health professionals, are focused on developing an Embedded Assessment Architecture (EMA) to be implemented across MedSims, health games, LMS 2.0, online educational multimedia modules, and other emerging learning technologies (e.g., patient-oriented health education social media).
Project goals / research question:
The need for embedded assessment in learning technologies is addressed via analytic algorithms that collect evidence-based metrics on problem solving as it relates to professional expertise. The EMA “listens” for events that are meaningful to the simulation’s learning objectives. The EMA consists of variables measuring key concepts and skills; assessment tasks and scoring guides; statistical and quality control procedures; and feedback mechanisms.
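To make the event-driven idea concrete, the following is a minimal sketch, in Python, of how an assessment component might capture only those simulation events that map to a scenario's learning objectives. The names (LearningObjective, AssessmentListener, on_event) and the bronchoscopy event types are illustrative assumptions, not part of the EMA specification.

# Hypothetical sketch: an assessment "listener" that records only simulation
# events mapped to the learning objectives of a scenario.

from dataclasses import dataclass, field


@dataclass
class LearningObjective:
    objective_id: str
    description: str
    # Event types in the simulation that provide evidence for this objective.
    relevant_events: set[str] = field(default_factory=set)


@dataclass
class AssessmentListener:
    objectives: list[LearningObjective]
    captured: list[dict] = field(default_factory=list)

    def on_event(self, event_type: str, payload: dict) -> None:
        """Record an event only if it is meaningful to some learning objective."""
        for objective in self.objectives:
            if event_type in objective.relevant_events:
                self.captured.append(
                    {"objective": objective.objective_id,
                     "event": event_type,
                     "data": payload}
                )


# Example: a bronchoscopy scenario objective that listens for navigation events.
navigation = LearningObjective(
    objective_id="navigate_airway",
    description="Advance the scope to the target bronchus without wall contact",
    relevant_events={"scope_advanced", "wall_contact"},
)
listener = AssessmentListener(objectives=[navigation])
listener.on_event("wall_contact", {"force_newtons": 0.4, "time_s": 32.1})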
Method:
Evidence-based simulation and learning-game platforms are the result of systematically implementing valid assessments linked to a specific instructional context in real time. The EMA employs editable Simulation Object Templates (SOTs). Each SOT provides numerous data points for scoring calculations and is linked to a) an embedded assessment configuration that guides task creation; b) key concepts and skills that define the training variables; c) scoring guides linked to those variables; d) statistical and quality control procedures that validate scoring through performance measures on each variable; and e) feedback mechanisms, including variable maps that track the learner’s progress and performance.
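As an illustration only, the five linked elements of an SOT could be represented by a simple data structure such as the sketch below; the field names are assumptions made for this example rather than the actual SOT schema.

# Hypothetical sketch of a Simulation Object Template (SOT) holding the five
# linked elements described above.

from dataclasses import dataclass, field


@dataclass
class Variable:
    name: str            # key concept or skill being measured
    description: str


@dataclass
class ScoringGuide:
    variable_name: str   # the variable this guide scores
    rubric: dict         # e.g. {"no_wall_contact": 2, "single_contact": 1, "repeated_contact": 0}


@dataclass
class SimulationObjectTemplate:
    assessment_config: dict                                   # a) guides task creation
    variables: list[Variable]                                  # b) key concepts and skills
    scoring_guides: list[ScoringGuide]                         # c) scoring guides linked to variables
    quality_checks: list[str] = field(default_factory=list)   # d) statistical / quality control procedures
    feedback: dict = field(default_factory=dict)               # e) variable maps for progress and performance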
Results/concepts:
Two types of scoring systems have been developed for the EMA: application-specific, or just-in-time, scoring, which provides scores immediately after the training session (as in the fiber-optic bronchoscopy simulator and the electronic fetal monitoring simulation); and post-analysis scoring, in which data are transferred to the Common Elements System (COMMONS) as both raw and indexed data for cross-application analysis, regardless of where across the participating networks the research data were generated.
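The two scoring paths can be illustrated with a brief sketch: an immediate score computed at the end of a session, and a packaged record (raw plus indexed data) prepared for transfer to a common store. The JSON record format shown is an assumption for illustration; COMMONS' actual data model and interfaces are not described here.

# Hypothetical sketch of the two scoring paths: a just-in-time score returned
# at the end of a session, and a record packaged for later cross-application
# analysis in a common store.

import json
from datetime import datetime, timezone


def just_in_time_score(captured_events: list[dict], scoring_guide: dict) -> float:
    """Compute a score immediately after training from captured events."""
    return sum(scoring_guide.get(e["event"], 0) for e in captured_events)


def build_commons_record(app_id: str, learner_id: str,
                         raw_events: list[dict], score: float) -> str:
    """Package both raw and indexed data for transfer to the common store."""
    record = {
        "application": app_id,
        "learner": learner_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "raw_events": raw_events,                  # unprocessed evidence
        "indexed": {"total_score": score},         # derived, query-ready values
    }
    return json.dumps(record)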
Discussion & Conclusion:
The integrated learning technology platform is conceived as a hub with an integrated repository that uses computer-generated (CG) scores and learners’ self-assessments to store data in near real time in a cloud-based warehouse. Additionally, using data visualization, summary statistics and prediction models, any integrated learning tool employs algorithms to discover hidden patterns, unexpected trends or associations between the learner’s performance reports and normative (aggregate) statistical tables presented in graphic formats, which allows decision-makers to detect gaps and areas for improvement in the learning methodology.
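One simple way such gap detection against normative tables might work is sketched below: a learner's per-variable scores are compared with cohort aggregates and flagged when they fall well below the norm. The variable names and the z-score threshold are illustrative assumptions, not the platform's actual analytics.

# Hypothetical sketch: flag variables where a learner falls well below the
# normative (aggregate) cohort statistics.

from statistics import mean, stdev


def flag_gaps(learner_scores: dict[str, float],
              cohort_scores: dict[str, list[float]],
              threshold: float = -1.0) -> list[str]:
    """Return variables where the learner scores well below the cohort norm."""
    gaps = []
    for variable, score in learner_scores.items():
        cohort = cohort_scores.get(variable, [])
        if len(cohort) < 2:
            continue  # not enough normative data for this variable
        z = (score - mean(cohort)) / (stdev(cohort) or 1.0)
        if z < threshold:
            gaps.append(variable)
    return gaps


# Example: the learner lags the cohort on airway navigation.
print(flag_gaps({"navigate_airway": 4.0},
                {"navigate_airway": [8.0, 9.0, 7.5, 8.5]}))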
Keywords:
Experiential learning systems, e-assessment methodology, personalized technology-enhanced learning, ambient intelligence, smart educational environments.