LIVE Researchers Win NSF Grant to Study Technological Support for Simulation Training in Nursing
A team of Vanderbilt researchers has been awarded a new NSF grant to create technological supports for simulation learning in nursing. The interdisciplinary team is led by PI Daniel Levin (Psychological Sciences) and includes co-PIs Gautam Biswas and Meiyi Ma (Computer Science), Mary-Ann Jesse (Nursing), and Alyssa Wise (Teaching and Learning). The focus of the project is to develop and test MOMENTS (MultimOdal Metacognitive EveNt-based Training for Self-regulation), an integrated system of learning applications that leverages advanced analytics derived from data collected as nursing students engage in simulation learning exercises in the Nursing School's Simulation and Skills lab. The simulation lab includes a set of realistic training environments complete with manikin robot patients, real equipment, and a suite of tools for creating compelling mixed-reality learning experiences for students.
The simulation lab has long been a major part of the training program in the School of Nursing, but a recent call to action by Nursing School dean Pamela Jeffries highlights the need to further enhance simulation experiences and to improve assessments of student simulation performance. The MOMENTS system will address these needs through two learning applications, Debrief and Reflect, along with a multimodal data system and a set of information dashboards to track, analyze, and support access to learning data. The Debrief application will give students and instructors immediate access to video data from simulations, and it will support student and instructor annotation of key moments for review and discussion. The Reflect application will support a more intensive review of, and reflection on, the events that occurred during the simulation. A key part of the system is its capability to leverage advanced multimodal data collected during simulation exercises, including eye gaze (collected using cutting-edge eye-tracking glasses), video, and system action logs. The MOMENTS system will use these data to help develop self-regulation skills in the context of event-based, real-world learning.
A particularly exciting part of this project is that it will generate data that can be used to train AI models relating simulation events to learning analytics. These models will help in generalizing the MOMENTS system to other domains, detecting variations in the effectiveness of real-world actions, and scaling the benefits of high-cost simulations to AR and VR environments. To help develop these real-world benefits, the team is partnering with Healthstream, the premier provider of continuing education for millions of practicing nurses. This partnership reflects LIVE's goals of integrating the efforts of Vanderbilt researchers in learning technology with external partners who can help this work have real-world impact.