How do you train someone to deactivate a landmine, work with pathogens in a lab or triage disaster victims with head injuries? Very, very carefully.
Typically, training for hazardous, high-risk jobs is an expensive production: dozens of people playing roles, supervisors taking notes, and a staged environment that can't reproduce the unexpected issues that arise in the chaos of actual events.
A team of technologists at consulting and engineering services firm Booz Allen Hamilton is turning to immersive virtual reality environments, powered by NVIDIA GPUs, to bring a dose of realism to hazardous job training and better prepare workers for the stress of those roles.
VR allows the Booz Allen team to replicate the numerous stimuli of a life-threatening situation, with all of the sights, noises and traumatic emotions that can come with such moments.
VR for Bang Up Jobs
Traditional hazardous duty training is expensive and limiting, according to Sandra Marshall, chief technologist at Booz Allen. “There can be high travel costs, and the ability to collect metrics is less than ideal,” she said. “There’s a real need for improvement in training for high-risk jobs.”
Marshall and Elyse Heob, lead technologist at Booz Allen, are leading a team that’s applying immersive technology to hazardous job training. Their latest effort is in medical triage for battlefield settings.
The application they’ve built exposes trainees to virtual peril, with the aim of saving the expense of assembling complex live trainings, as well as making trainees much better at their jobs.
This type of model is applicable to many more settings. Whether inside a biological lab or a field filled with unexploded landmines, virtual training has the potential to transform numerous jobs.
“It’s much easier to give routine exposure of these things in virtual reality,” Marshall said. “It’s really a life-saving tool.”
Goldmine of Data
Heob noted that training in a virtual setting triggers all of the learning systems in trainees’ bodies — emotional, cognitive, experiential and behavioral — simultaneously.
That feeds another goal of their work, which isn’t simply to better prepare people for managing stressful stimuli, but also to collect more data on how they respond to it. That data will help to refine and personalize their applications, making both the training and the trainees more resilient in the process.
Having more insight into trainees’ reactions can help trainers to gauge whether trainees’ stress levels are too high, and if they might not be ready to be in a particular environment, Heob said.
For instance, the information gathered during a virtual training might indicate that a prospective triage worker is traumatized by witnessing serious injuries, and thus might not be best suited for that role.
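The kind of readiness check described above can be sketched in a few lines. This is a hypothetical illustration, not Booz Allen's actual analysis: the metric (heart rate relative to a resting baseline) and the threshold are assumptions chosen for clarity.

```python
# Hypothetical sketch: flag trainees whose stress response during a VR
# session exceeds a readiness threshold. The heart-rate metric, baseline,
# and threshold are illustrative assumptions, not Booz Allen's metrics.

def mean(values):
    return sum(values) / len(values)

def stress_flags(sessions, baseline_hr=70.0, threshold=1.4):
    """Return IDs of trainees whose average heart rate during a session
    exceeded `threshold` times the resting baseline."""
    flagged = []
    for trainee_id, hr_samples in sessions.items():
        if mean(hr_samples) > threshold * baseline_hr:
            flagged.append(trainee_id)
    return flagged

sessions = {
    "trainee_a": [72, 75, 78, 74],      # stays near baseline
    "trainee_b": [110, 118, 125, 130],  # sustained elevation
}
print(stress_flags(sessions))  # ['trainee_b']
```

In practice, a trainer would use such a flag as a prompt for follow-up, not as an automatic disqualification.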
Reusable Training Components
To build these applications, Marshall and Heob’s team has been combining NVIDIA GPUs with an array of VR technologies — tethered VR platforms, biometric sensors, and eye and object trackers — and developing their apps on the open-platform Unity game-development engine.
In doing so, the Booz Allen team is building a library of reusable multi-player components that function similarly to video games. Each component is essentially a feature — one tracks reaction times, another scores responses, etc. — that can be incorporated into any training app the team develops.
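The reusable-component pattern described above can be sketched as a common interface that each feature implements. This is a simplified Python illustration of the idea; the class and method names are assumptions, and the team's actual components are built in the Unity engine rather than in Python.

```python
# Illustrative sketch of a reusable training-component library: each
# feature implements one shared interface so any app can compose them.
# All names here are hypothetical, not Booz Allen's actual API.

class TrainingComponent:
    """Base interface every reusable feature implements."""
    def record(self, event):
        raise NotImplementedError
    def summary(self):
        raise NotImplementedError

class ReactionTimeTracker(TrainingComponent):
    """Tracks how long a trainee takes to respond to each stimulus."""
    def __init__(self):
        self.times = []
    def record(self, event):
        self.times.append(event["response_t"] - event["stimulus_t"])
    def summary(self):
        return {"avg_reaction_s": sum(self.times) / len(self.times)}

class ResponseScorer(TrainingComponent):
    """Scores whether the trainee's action matched the expected one."""
    def __init__(self):
        self.correct = 0
        self.total = 0
    def record(self, event):
        self.total += 1
        self.correct += event["action"] == event["expected"]
    def summary(self):
        return {"accuracy": self.correct / self.total}

# Any training app composes only the features it needs:
components = [ReactionTimeTracker(), ResponseScorer()]
event = {"stimulus_t": 0.0, "response_t": 1.2,
         "action": "apply_tourniquet", "expected": "apply_tourniquet"}
for c in components:
    c.record(event)
print([c.summary() for c in components])
```

The value of the pattern is that a new training app, whether for battlefield triage or lab safety, reuses the same tested components rather than reimplementing metrics from scratch.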
“Different industries may want similar features,” said Marshall, “so we configure those tools so they can be reused.”
Wait Until They Add AI
Eventually, Marshall envisions layering deep learning-infused AI over the apps and teaching them to learn from each training session.
“I’d like to see immersive technology radically transform the way enterprises do training so we can accumulate training data from users over time and develop more intelligent applications and informed policies,” said Marshall.
While the defense market is currently most eager to adopt immersive VR technology, Heob believes it will one day benefit any job in which people need to be prepared for emergencies or hazardous environments.
Ultimately, Marshall said, it’s all about how the training applications can address real-world training deficiencies with the data they collect. Analyzing reaction times and stress levels while tracking biometric and neurofeedback data can create targeted training to capitalize on the trainee’s strengths while improving areas of weakness.
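One way to picture the targeted-training idea is to combine per-skill metrics into a single weakness score and drill the weakest area next. This is a minimal, hypothetical sketch: the skill names, metrics, and weights are invented for illustration and do not reflect Booz Allen's actual scoring.

```python
# Minimal sketch of targeted training: combine per-skill reaction times
# and normalized stress readings into a score, then pick the weakest
# skill to drill next. All names and weights are hypothetical.

def weakest_skill(metrics, w_time=0.5, w_stress=0.5):
    """metrics: {skill: {"reaction_s": float, "stress": float in [0, 1]}}.
    A higher combined score means weaker performance."""
    def score(m):
        return w_time * m["reaction_s"] + w_stress * m["stress"]
    return max(metrics, key=lambda skill: score(metrics[skill]))

metrics = {
    "triage_sorting": {"reaction_s": 1.1, "stress": 0.3},
    "airway_check":   {"reaction_s": 2.4, "stress": 0.7},
}
print(weakest_skill(metrics))  # airway_check
```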
“You’re going to have a better equipped, smarter workforce, and a safer one,” said Marshall. “And you’re going to have reduced costs for training.”