New academic research applies biometric measurements to flight training

A joint project between Southern Methodist University (SMU) and simulator manufacturer CAE has explored the use of biometric measurements during flight training, with the aim of investigating the potential benefits of a personalized training regime, AINonline reports.
The novel approach combines biometrics with machine learning techniques to measure situational awareness and the cognitive load of pilots in various scenarios.
The biometric measurements include visual gaze patterns, pupil size, and heart rate, which are used to determine a pilot’s level of engagement, workload, situational awareness, stress, or fatigue.
For instance, a “poor” gaze pattern — depending on the phase of flight — could indicate a high workload, while a “correct” gaze pattern would show a higher level of attention and performance.
Similarly, fewer eye blinks or blinks of shorter duration could be correlated with tasks requiring greater attention, and heart rate variability could be used to track effort.
The biometric data was collected using a virtual reality headset with an integrated eye-tracker and a wrist-worn device, then correlated via computer analysis.
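The article does not describe the researchers’ actual feature pipeline or models, but the general idea of deriving workload indicators from eye-tracking and heart-rate streams can be illustrated. The sketch below is a minimal, hypothetical example, not the SMU/CAE implementation: it estimates a blink rate from a pupil-diameter signal, computes a standard heart-rate-variability measure (RMSSD) from inter-beat intervals, and feeds both features to an off-the-shelf classifier. All function names, thresholds, and the synthetic data are assumptions made purely for illustration.

```python
# Hypothetical sketch: simple workload features from eye-tracking and
# heart-rate streams, classified as "low" vs. "high" workload.
# Not the SMU/CAE pipeline; thresholds and data are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

def blink_rate(pupil_diameter_mm, sample_hz, closed_threshold_mm=1.0):
    """Blinks per minute, treating samples below a diameter threshold as eye closure."""
    closed = pupil_diameter_mm < closed_threshold_mm
    # Count rising edges of the "closed" signal (start of each blink).
    blink_starts = np.sum(closed[1:] & ~closed[:-1])
    duration_min = len(pupil_diameter_mm) / sample_hz / 60.0
    return blink_starts / duration_min

def rmssd(ibi_ms):
    """Heart-rate variability as RMSSD over inter-beat intervals (ms)."""
    diffs = np.diff(ibi_ms)
    return np.sqrt(np.mean(diffs ** 2))

def features(pupil_diameter_mm, ibi_ms, sample_hz=60):
    return [blink_rate(pupil_diameter_mm, sample_hz), rmssd(ibi_ms)]

# Synthetic training data: each sample represents one simulator segment.
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):  # 0 = low workload, 1 = high workload (illustrative labels)
    for _ in range(50):
        # Fewer blinks and lower HRV under higher workload, as described above.
        pupil = rng.normal(3.0, 0.5, 60 * 60)                 # 1 minute at 60 Hz
        blink_count = 15 if label == 0 else 6
        for start in rng.choice(60 * 60 - 10, blink_count, replace=False):
            pupil[start:start + 6] = 0.2                       # simulated eye closures
        ibi = rng.normal(800, 60 if label == 0 else 25, 80)    # inter-beat intervals (ms)
        X.append(features(pupil, ibi))
        y.append(label)

model = LogisticRegression().fit(np.array(X), np.array(y))
print("example prediction:", model.predict(np.array([X[0]])))
```

In practice, a system like the one described would fuse many more signals (gaze dwell times, pupil dilation, scan-path structure) and would be validated against expert evaluator ratings, as the project reportedly does.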
The project is now entering its fourth year, and according to the researchers, some of the early automated biometric test results largely match the assessments of highly experienced human evaluators.
“Our theory is that biometrics during the simulation will result in much more objective and accurate measurements than asking users a few questions after the simulation to measure their experience,” explains Suku Nair, director of the Center for Virtualization at SMU.
This would, in turn, make flight training potentially more personalized, effective, and efficient.
However, accelerating learning with biometric sensing is a difficult, unproven hypothesis, according to the study’s principal investigator, Eric Larson.
“This research seeks to understand how sensing can be used to understand a learner’s mastery level in a difficult task, like flying an aircraft. We hope to advance the research field by being the first group to show whether personalized, automated learning can show efficacy in an actual learning scenario.”
For context, the research was originally conceived in support of a 2019 Department of Defense project focused on accelerating the training of complex skills and supporting multi-domain warfare.
In the same year, SMU and CAE (then part of L3Harris Technologies) first demonstrated that machine learning based on biometric data could deliver accurate performance results in real time. L3Harris’ flight training technology business became part of CAE after Leidos acquired L3Harris’ biometrics and security business in 2020.
Fast forward to 2022, and much of the data collected as part of the project is reportedly based on a repeated-measures experiment with 40 test subjects of varying backgrounds and experience levels, flying a mixed reality (MR) flight simulator in a controlled environment.
SMU and CAE have demonstrated the feasibility and utility of the physiological sensor system through 33 real flight maneuvers conducted by pilots at Edwards Air Force Base.
Eye-tracking technology has also recently been used by Microsoft in its Flight Simulator game.