Alder-Mayo

In partnership with Mayo Clinic, a team of Next Lab student workers is creating a virtual reality training simulation for medical students to practice diagnosing strokes. This proof of concept features hyper-realistic 3D models of medical patients that learners can interact with directly using VR hand-tracking controls and voice recognition. Using the National Institutes of Health Stroke Scale (NIHSS) as a guide, learners assess signs of a stroke and are graded on their accuracy.
Driven by the need for an interactive, immersive alternative to traditional video-based stroke-assessment training, ASU Next Lab’s Alder-Mayo project aims to modernize clinical education through experiential learning in virtual reality. According to experts from our partners at Mayo Clinic, existing stroke-assessment training relies on video-based observation with little interactive practice. Traditional simulation training has limitations of its own: live patient actors cannot reproducibly portray neurologic deficits, and sessions with them are constrained by accessibility and scheduling. With this in mind, the Alder-Mayo team is developing a virtual reality platform for realistic, interactive NIHSS training, giving medical professionals and students the opportunity to develop competency in neurologic examination techniques beyond video-based learning.
Realistic Virtual Patient: Using Character Creator 4, our team creates hyper-realistic 3D human models for interactive virtual patients. Integrating these models with iClone and its AccuLips tool allows for high-detail facial motion capture, letting us capture face-tracking data from both live and pre-recorded video, and even sync lip movements to audio using AI-powered animation. This blend of 3D modeling and animation technology makes it possible to display even the most minute changes on the face, details that are essential to a real-life stroke severity assessment.
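As a rough illustration of how such a rig might be driven at runtime in Unity, the sketch below adjusts blendshape weights on a SkinnedMeshRenderer to produce one-sided facial droop. The blendshape names and the severity-to-weight mapping are assumptions for illustration, not the project's actual implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: driving facial blendshapes on a SkinnedMeshRenderer
// to show one-sided facial droop. The blendshape names ("Mouth_Smile_L",
// "Mouth_Smile_R") follow common Character Creator naming conventions but
// are assumptions, as is the severity-to-weight mapping.
public class FacialDroopController : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer faceMesh;

    // NIHSS facial palsy (item 4) is scored 0 (normal) to 3 (complete palsy).
    [Range(0f, 3f)]
    [SerializeField] private float facialPalsyScore;

    void LateUpdate()
    {
        // Weaken the smile on the affected side in proportion to severity,
        // leaving the unaffected side at full range.
        float droop = Mathf.Clamp01(facialPalsyScore / 3f) * 100f;
        SetShape("Mouth_Smile_L", 100f - droop);
        SetShape("Mouth_Smile_R", 100f);
    }

    private void SetShape(string shapeName, float weight)
    {
        int index = faceMesh.sharedMesh.GetBlendShapeIndex(shapeName);
        if (index >= 0)
            faceMesh.SetBlendShapeWeight(index, weight); // Unity weights run 0-100
    }
}
```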
Hand-Tracking and Voice Commands: Our team is using the Unity game engine to let learners interact with virtual patients through both the hand-tracking controls found on Meta Quest and Apple Vision Pro headsets, via Unity’s XR Hands package, and offline AI voice recognition and activation, via the Undertone plugin. Through these modes of interaction, the Alder-Mayo platform aims to make the virtual stroke assessment process as true-to-life as possible, since healthcare professionals often grab the patient’s arm directly and deliver verbal instructions in order to make a diagnosis.
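To give a flavor of the hand-tracking side, here is a minimal sketch that reads joint poses from Unity's XR Hands package and treats a pinch near the patient's wrist as a grab. The pinch and grab thresholds, the patientWrist anchor, and the assumption that the XR Origin sits at the world origin are all illustrative; the actual project may use higher-level interaction components instead.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal sketch of reading hand-tracking data with Unity's XR Hands package.
// Thresholds and the patient-arm anchor are illustrative assumptions.
public class PatientArmGrabDetector : MonoBehaviour
{
    [SerializeField] private Transform patientWrist;   // hypothetical anchor on the patient model
    [SerializeField] private float grabRadius = 0.08f; // metres

    private XRHandSubsystem handSubsystem;

    void Update()
    {
        if (handSubsystem == null)
        {
            // Find the running hand subsystem (e.g. the Meta Quest provider).
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                handSubsystem = subsystems[0];
            return;
        }

        CheckHand(handSubsystem.leftHand);
        CheckHand(handSubsystem.rightHand);
    }

    private void CheckHand(XRHand hand)
    {
        if (!hand.isTracked)
            return;

        // Joint poses are reported in XR-origin space; for simplicity this
        // sketch assumes the XR Origin sits at the world origin.
        var thumb = hand.GetJoint(XRHandJointID.ThumbTip);
        var index = hand.GetJoint(XRHandJointID.IndexTip);
        if (thumb.TryGetPose(out Pose thumbPose) && index.TryGetPose(out Pose indexPose))
        {
            // A thumb-index pinch close to the wrist counts as a grab.
            bool pinching = Vector3.Distance(thumbPose.position, indexPose.position) < 0.02f;
            bool nearArm = Vector3.Distance(indexPose.position, patientWrist.position) < grabRadius;
            if (pinching && nearArm)
                Debug.Log("Learner grabbed the patient's arm for the motor exam.");
        }
    }
}
```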
Accurate Assessment: With guidance from Mayo Clinic doctors, the project aims to make the experience as close to real life as possible, built around the official National Institutes of Health Stroke Scale and realistic patient scenarios and interactions. This first-phase proof of concept focuses on four main categories of the scale, with the project structured so that the entire scale can be built out in future iterations. Learners are graded on the accuracy of their diagnosis after the virtual examination, with statistics providing direct feedback on their performance.
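To illustrate how such grading might be structured, the sketch below compares a learner's per-item NIHSS scores against a scenario's expert-defined ground truth. The four items listed are placeholders with their standard NIHSS score ranges; the POC's actual category selection and scoring rubric are not specified here.

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of the grading step: compare the learner's NIHSS item
// scores against expert-defined ground truth for the scenario. The four items
// below are placeholders, not necessarily the four categories in the POC.
public class AssessmentGrader
{
    private static readonly (string Name, int MaxScore)[] Items =
    {
        ("Level of Consciousness", 3),
        ("Facial Palsy", 3),
        ("Motor Arm", 4),
        ("Best Language", 3),
    };

    // Returns the fraction of items the learner scored exactly as the expert did.
    public static double Grade(IReadOnlyDictionary<string, int> learner,
                               IReadOnlyDictionary<string, int> expert)
    {
        int correct = Items.Count(item =>
            learner.TryGetValue(item.Name, out int l) &&
            expert.TryGetValue(item.Name, out int e) &&
            l == e);
        return (double)correct / Items.Length;
    }
}
```

Per-item comparison like this also makes it straightforward to report which parts of the exam the learner missed, rather than only an overall percentage.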
Launch date:
- March 2025
Creators:
- Dr. Matthew Hoerth
- Amanda Federico
- Joseph Orta
- Maximilia Hackert
- Ryan Carpenter
- Devanshi Marfatia
- Nicolette Lowery
- Marina Nasralla
- Jose Sanchez