Spatial disorientation (SD) costs US military aviation roughly six pilots' lives and over $300 million in destroyed aircraft and property damage each year, with comparable US civilian losses. Using modeling and simulation, Alion developed an approach for improving pilot spatial orientation and reducing the consequences of SD events.
Astronauts also suffer from SD, but they experience significantly different illusions, such as visual reorientation illusions, the otolith tilt-translation illusion, and others related to microgravity and multiple visual frames of reference.
Alion has developed a model-based multi-sensory approach to reducing the costs of SD. Using Micro Saint Sharp, we created an intelligent Spatial Orientation Aiding System (SOAS) for the cockpit that uses the latest research in human physiology, workload metrics, and multi-sensory countermeasures. SOAS controls visual display enhancements, and auditory and haptic cues and commands by triggering the appropriate countermeasures based upon the severity of the SD situation and the crew’s workload.
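The triggering logic described above, which matches countermeasure modality to event severity and crew workload, can be sketched as a simple decision rule. The function name, thresholds, and escalation order below are illustrative assumptions, not the actual SOAS logic.

```python
# Hypothetical sketch of SOAS-style countermeasure triggering. Thresholds
# and modality ordering are assumptions for illustration only.

def select_countermeasures(sd_severity: float, crew_workload: float) -> list:
    """Pick cue modalities from SD severity and crew workload (both 0..1).

    Assumed rationale: visual channels saturate first under high workload,
    so severe events escalate to auditory and then haptic cues/commands.
    """
    cues = []
    if sd_severity >= 0.2:
        cues.append("visual display enhancement")
    if sd_severity >= 0.5 or crew_workload >= 0.7:
        cues.append("auditory cue")
    if sd_severity >= 0.8:
        cues.append("haptic command")
    return cues

# A moderate event under heavy workload escalates past visual-only cueing:
print(select_countermeasures(0.3, 0.8))
```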
We successfully modeled common SD illusions: the Leans, the most prevalent and a slow-onset event; Coriolis, a deadly, quick-onset event; and the Graveyard Spiral, an illusion that often occurs in civilian flying. The models accurately detect SD events in actual aircraft accident data received from the Air Force Safety Center, the Navy, and the NTSB. During analyses, we demonstrated that we could model SD; account for realistic aircraft, world, and crew state variables; and apply appropriate countermeasures based upon the situation and the crew's workload.
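To make the Leans concrete: roll rates below the semicircular-canal detection threshold go unperceived, so a slow roll builds a bank angle the pilot never feels. The sketch below is a minimal illustration of that divergence, not the detection model used in SDAT/SOAS; the ~2 deg/s threshold is a commonly cited figure in the vestibular literature.

```python
# Minimal Leans-style divergence sketch. Assumption: only roll rates at or
# above the canal detection threshold contribute to perceived bank.

CANAL_THRESHOLD_DEG_S = 2.0  # assumed vestibular detection threshold

def leans_discrepancy(roll_rates_deg_s, dt_s=1.0):
    """Integrate a roll-rate time series (deg/s, sampled every dt_s seconds)
    and return (actual_bank, perceived_bank, discrepancy) in degrees."""
    actual = perceived = 0.0
    for rate in roll_rates_deg_s:
        actual += rate * dt_s
        if abs(rate) >= CANAL_THRESHOLD_DEG_S:  # only supra-threshold motion is sensed
            perceived += rate * dt_s
    return actual, perceived, actual - perceived

# A gentle 1 deg/s roll sustained for 20 s yields a 20-degree bank
# with zero perceived bank, i.e. a 20-degree discrepancy.
print(leans_discrepancy([1.0] * 20))
```

A detector built on this idea would flag an SD epoch whenever the discrepancy exceeds some tolerance; real systems would also model threshold adaptation and washout.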
In addition, we developed an SD Analysis Tool (SDAT) to help researchers and safety investigators determine whether a flight profile is likely to have induced SD. SDAT users select a flight data file (from simulators or aircraft), tailor the analysis to their particular purposes, run the file, and then examine the results, as shown in the screenshots on this page.
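The SDAT workflow, load a flight data file, apply analysis settings, and review flagged results, can be sketched as a small batch pipeline. The file format, column names, and the single-parameter flagging rule below are all illustrative assumptions, not SDAT's actual interface.

```python
# Hypothetical SDAT-like batch run: parse a flight-data file, apply a
# user-tailored setting, and report flagged epochs. Columns "t_s" and
# "bank_deg" and the bank-limit rule are assumptions for illustration.
import csv
import io

def run_analysis(flight_csv: str, roll_limit_deg: float = 45.0):
    """Return the timestamps (s) of samples whose bank angle exceeds
    roll_limit_deg, standing in for a richer SD-likelihood analysis."""
    flagged = []
    for row in csv.DictReader(io.StringIO(flight_csv)):
        if abs(float(row["bank_deg"])) > roll_limit_deg:
            flagged.append(float(row["t_s"]))
    return flagged

sample = "t_s,bank_deg\n0,5\n1,50\n2,60\n"
print(run_analysis(sample))  # the two over-limit samples are flagged
```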
Under contract to NASA-NSBRI (NCC 9-58-511), we are also applying our approach to helping astronauts. The NSBRI project's goal is to extend Alion's SOAS and SDAT software, originally developed for aeronautical use, to NASA applications. Both tools incorporate models of the human vestibular system and assessor heuristics to predict the epoch and probability of an SD event and any other disparities between actual and perceived attitude, roll rate, or heading/yaw rate.
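The core idea behind such vestibular models is that the semicircular canals respond roughly like a high-pass filter on angular velocity: sustained rotation washes out of perception, and stopping produces an illusory counter-rotation. The first-order sketch below illustrates that actual-versus-perceived disparity; the ~5.7 s yaw time constant is a commonly cited figure, and the whole model is a simplification, not the MIT models referenced in the text.

```python
# First-order semicircular-canal sketch: perceived angular rate is a
# high-pass-filtered version of actual rate. Time constant is an assumed,
# commonly cited value for the yaw axis.

TAU_S = 5.7  # assumed canal time constant, seconds

def perceived_yaw_rate(actual_rates, dt=0.1, tau=TAU_S):
    """Discrete first-order high-pass filter of actual yaw rate (deg/s)."""
    a = tau / (tau + dt)
    y, prev_x, out = 0.0, 0.0, []
    for x in actual_rates:
        y = a * (y + x - prev_x)  # standard discrete high-pass update
        prev_x = x
        out.append(y)
    return out

# 60 s of a constant 10 deg/s turn, then a sudden stop:
trace = [10.0] * 600 + [0.0] * 100
p = perceived_yaw_rate(trace)
# Early in the turn the rate is sensed; by 60 s it has washed out toward
# zero; on stopping, the pilot perceives a turn in the opposite direction.
```

Comparing `p` against the actual rate trace is exactly the kind of actual-versus-perceived disparity an assessor heuristic would flag.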
This four-year NSBRI project will:
1. Enhance the utility of SDAT/SOAS by including comprehensive mathematical models from MIT for vestibular and sensory cues, and for translating central nervous system gravito-inertial force resolution into perceived tilt and translation estimates. After enhancements are developed, the tools will be re-validated using the prior aeronautical data sets.
2. Extend the models to describe microgravity, and Shuttle and Lunar Module landing illusions. After these extensions are developed, the tools will be revalidated using Shuttle data sets and existing micro-g illusion theories (e.g., visual reorientation illusions, OTTR).
3. Expand SDAT/SOAS to consider multiple visual frames of reference (in-vehicle and out), the effects of visual attention and sensory workload, and the cognitive costs of mental rotation and reorientation. After these improvements to the tools (and the prior ones), they will be validated via flight experiments (e.g., parabolic flights).
4. Tailor SOAS for lunar landings, using multi-sensory workload and frame-of-reference theory to choose the appropriate SD countermeasures and their timing.
Countermeasures will include visual control command displays, synthetic and enhanced vision displays, auditory cues and commands, and haptic cues and commands.
Pilots and astronauts alike stand to benefit from a better understanding of spatial disorientation and from the application of appropriate countermeasures that reduce SD-caused mishaps. Simulation and data set analyses are useful tools for examining SD events and for assessing countermeasure efficacy.
For more information about SDAT, please contact firstname.lastname@example.org.