Research Projects
Active Research Projects:
(Updated Fall 2024)
Algorithmic Venous Gas Emboli Detection and Diagnostics for Wearable Ultrasound
Decompression sickness (DCS) is a known health and safety risk for astronauts embarking on extravehicular activity. DCS pathology is characterized by the presence of venous gas emboli (VGE) in the heart. Clinically, DCS is diagnosed via ultrasound examination to confirm VGE; however, these examinations are only administered after symptoms have already begun. With the recent development of the first wearable ultrasound prototypes, the ability to continuously and autonomously monitor for DCS is on the horizon. We are developing an algorithmic, computer-vision-based means of identifying VGE in cardiac ultrasound images. This algorithm will detect the incidence of DCS and classify the severity of VGE presence. The algorithm's capabilities will also be evaluated under degraded signal quality and under the influence of motion.
Collaborators: Dr. Karina Marshall-Goebel (NASA Human Physiology, Performance, Protection and Operations Laboratory (H-3PO))
Current Students:
PhD: Victoria Hurd
Funding: NASA Space Technology Graduate Research Opportunity (NSTGRO) 2024
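As a toy illustration of the detection idea (not the lab's actual algorithm), VGE appear as bright, compact echoes in B-mode cardiac frames, so the simplest baseline is intensity thresholding followed by connected-component counting. The sketch below is a hypothetical, dependency-free stand-in: function names, the threshold, and the minimum-area filter are all illustrative assumptions.

```python
from collections import deque

def count_bubble_candidates(frame, threshold=200, min_area=2):
    """Count bright connected regions ("bubble candidates") in a grayscale
    ultrasound frame given as a 2-D list of 0-255 pixel values.
    Toy baseline only; a real VGE detector would need far more than this."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill the 4-connected bright region to measure its area.
                area, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Discard speckle: regions smaller than min_area pixels.
                if area >= min_area:
                    count += 1
    return count
```

A per-frame candidate count like this could, in principle, be mapped onto an ordinal bubble-severity grade, though real ultrasound speckle and probe motion would demand far more robust filtering than a fixed threshold.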
Multi-Environment Virtual Training for Long Duration Exploration Missions
Pre-flight training is an essential part of preparing crews to perform mission-critical tasks in a spaceflight environment. On long-duration exploration missions, such as a journey to Mars, maintaining high-fidelity performance of these tasks may require continued training throughout transit and skill refreshers before execution on the surface. Virtual reality (VR) technology is an effective tool for training and skill development that may assist astronauts in maintaining performance of mission-critical tasks. Through this project, we are developing effective VR training environments for mission-critical tasks in entry, descent, and landing (EDL); habitat maintenance and repair; and planetary surface extravehicular activity (EVA). We are investigating skill transfer from virtual training environments to physical settings, the degree to which VR training enables performance maintenance over long durations, and how well skills learned in the training environment generalize to untrained tasks. In addition, we are studying the neural activation associated with complex mission-related task learning in VR, as training in a complex, adaptive VR environment may also serve as a countermeasure to spaceflight-associated neural decrements.
Collaborators: Dr. Torin Clark, Aadhit Gopinath (Professional Research Assistant)
Current Students:
Postdocs: Dr. Prachi Dutta
PhD: Luca Bonarrigo
Undergraduates: Matthew Bradford
Funding: NASA Human Research Program

Virtual Reality Design Development for xGEO Robust and Adaptive Space Domain Awareness (xRADAR)
The cislunar xGEO orbital regime is fundamentally different from traditional geocentric orbital regimes. The orbital mechanics are significantly more chaotic, and tracking objects for space domain awareness is more difficult. Because secondary and post-secondary curricula tend to focus on the LEO-to-GEO space, very few students and recent graduates are equipped to work in the xGEO area. A tool that helps operators visualize and understand the orbital mechanics in this regime would make the topic more accessible for students. Virtual reality (VR) shows promise for this modality, but most research in this area addresses its effectiveness in training rather than education. Our project aims to address this gap. We are designing and testing a VR interface that allows operators to choose and manipulate orbits for optimal xGEO cislunar orbital trajectory design. We are specifically investigating which displays and layouts convey information most effectively, and how they affect operator performance, workload, and situation awareness. We will use the results of our research to develop a set of guiding principles for the design of educational virtual reality tools and a summer bootcamp aimed at creating a network of students and professionals educated on the cislunar orbital regime. This research will inform the use of VR as an educational tool for both students and military operators.
Collaborators: Dr. Hanspeter Schaub (91¸£ÀûÉç), Dr. Shane Ross (Virginia Tech), Dr. Mirko Gamba (University of Michigan), Dr. Ilya Kolmanovsky (University of Michigan), Dr. Kevin Schroeder (Virginia Tech), Dr. Aaron Rosengren (University of California San Diego), Dr. Andrey Popov (University of Texas at Austin)
Current Students:
PhD: Jayce Cuberovic
Undergraduates: Saevar Rodine, Aaron Semones
High School: Max Frew
Funding: Universities Space Research Association (USRA) in collaboration with the United States Space Force (USSF)

Metrics and Models for Real Time Inference and Prediction of Trust in Human-Autonomy Teaming
Human-autonomy teaming is an increasingly integral component of crewed and remotely operated space missions. As mission duration and distance from Earth grow, humans will become increasingly reliant on their autonomous teammates. To facilitate efficient and effective collaboration, the human must appropriately trust the autonomous system to prevent misuse, overuse, and disuse. Trust has historically been measured through surveys, which are obtrusive and do not capture the dynamic nature of trust. Furthermore, previous work has treated trust as a one-dimensional construct when it is multidimensional in nature. Cognitive trust (CT) forms through rational, logical thinking and can be affected by the performance or reliability of the system. Affective trust (AT) is based on feelings and emotions and can be affected by the autonomous system's rhetoric. Biosignals, such as heart rate variability metrics, oxygenated hemoglobin, and skin conductance responses, as well as embedded measures (e.g., button clicks), have been shown to be promising indicators of trust. Furthermore, continuous physiological and behavioral monitoring is unobtrusive and can feed models that predict trust dynamics in real time. This work collects biosignals and embedded measures from human participants across numerous physiological signal modalities while they interact with a simulated autonomous teammate. Our goal is to model subjects' reported trust using their physiological responses and embedded measures, informing the development of metrics and models that can infer and predict trust.
Current Students:
PhD: Sarah Leary
Masters: Abby Rindfuss
Funding: Air Force Office of Scientific Research (AFOSR)
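The paragraph above names heart rate variability as one candidate biosignal for real-time trust inference. As a minimal sketch of one such feature, assuming RR intervals have already been extracted from the ECG, the snippet below computes RMSSD (root mean square of successive differences) over a sliding window; the function names and window size are illustrative, not the project's actual pipeline.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD), a common
    heart-rate-variability metric, from a list of RR intervals in ms."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def sliding_rmssd(rr_intervals_ms, window=5):
    """RMSSD over a sliding window, yielding one feature value per step
    as a real-time inference model might consume."""
    return [rmssd(rr_intervals_ms[i:i + window])
            for i in range(len(rr_intervals_ms) - window + 1)]
```

In a real-time system, a stream of features like this (alongside EDA, fNIRS, and embedded behavioral measures) would feed a model that maps physiological state to reported trust.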

G-Induced Loss of Consciousness Time-Prediction Modeling
The occurrence of G-Induced Loss of Consciousness (GLOC) has historically been unpredictable. We are working to provide the foundation for an in-cockpit GLOC monitoring system by using collected physiological data (EEG, fNIRS, ECG, respiration, and eye tracking) to predict when GLOC occurs in a controlled centrifuge environment. The aims of this project are threefold: 1) investigate and predict the time course of GLOC events, 2) identify which signals best predict GLOC in order to minimize the complexity of future sensor systems, and 3) assess how these time-course models vary from subject to subject. This project is part of Physiologic Sensing: Maximizing Operational Value and Executability (psMOVE).
Collaborators: Chris Dooley (Air Force Research Lab)
Postdocs: Dr. Aaron Allred
Current Students:
PhD: Nicole Rote
Funding: Air Force Research Lab

Cognitive Security Multi-University Research Initiative
Cognitive security refers to protecting humans from information-based threats that aim to disrupt cognitive processes such as reasoning and decision making. Cognitive security is particularly difficult to disentangle when we consider the complex (and understudied) ways that the information density spectrum affects decision-making. For example, the unique cognitive security challenges posed by low-information-density environments such as space and the Arctic are likely to be very different from those of high-information-density environments. Our goal is to support humans in maintaining cognitive security across a range of information densities in a variety of operational environments. First, we are completing human-centered interviews and focus groups with subject matter experts who have unique experience in the cognitive security domain, in both information-sparse and information-rich environments. The outcomes of these interviews will inform lab-based neurophysiological studies and field testing in Martian analog environments.

High-level objectives of the Cognitive Security Multi-University Research Initiative.

Virtual Reality Operations and Training for the Remote Supervision of Satellites
Remote supervision and monitoring of autonomous systems is an important modality for future operations including spaceflight, manufacturing, and transportation. However, remote supervision presents many challenges for the operator, which may be mitigated through proper interface and display design. Virtual reality (VR) shows promise, but it is unclear if VR is useful for future supervisory paradigms that will involve monitoring systems and sending intermittent commands rather than directly controlling them. In this project, we are developing a VR display to aid in satellite operations. We are investigating how the operator's performance, situation awareness, and workload compare between immersive VR displays, 2D displays with visualizations, and traditional 2D displays without visualizations. In addition, we are investigating how training in VR can influence performance in operations, even if the operations are done using traditional displays. This research will inform the use of VR as a display and training modality for remote supervision of autonomous systems.
Collaborators: Dr. Hanspeter Schaub
Current Students:
PhD: Savannah Buchner
Funding: SpaceWERX STTR in collaboration with Gridraster

Subject acting as remote satellite operator in virtual reality.
Skill Retention using AI-Assisted Point-of-Care Ultrasound in Novice, Technically Competent Users
Point-of-care ultrasonography (POCUS) is a clinical tool widely used for the diagnosis and monitoring of many acute medical conditions in hospital settings. While capable of capturing high-quality images, POCUS remains a lightweight, portable, and low-cost imaging modality. These characteristics are essential for effectiveness in austere environments where resources are limited. A significant challenge associated with POCUS is the operator's skill level and its degradation over time. Previous studies have focused on initial ultrasonography training and ability, but have not assessed skill decay or the temporal effects of training with artificial intelligence assistance. Providing data on this subject will be critical to understanding the utility of POCUS in remote settings, such as human spaceflight. The results of this research will inform POCUS training regimens for future operational medical assessments relevant to the military, human spaceflight, and austere medicine.
Collaborators: Dr. Matthew Riscinti (Denver Health, University of Colorado - Anschutz), Dr. Michael Del Valle (Denver Health), Dr. Arian Anderson (University of Colorado - Anschutz, NASA Exploration Medical Capabilities), Dr. Mike Heffler (Denver Health), Dr. William Mundo (Denver Health)
Current Students:Ìý
PhD: Victoria Hurd, Victoria Kravets

Real-time Unobtrusive Monitoring of Trust, Workload, and Situation Awareness through Psychophysiological and Embedded Measures
As humans venture farther from Earth, the spaces they inhabit will increasingly rely on autonomous systems to keep them alive, happy, healthy, and productive. Current mission architectures, such as that of the ISS, can rely heavily on frequent resupply missions and near-constant ground support; however, next-generation architectures will require efficient teaming between human crews and largely autonomous habitats. An intimate understanding of factors like crew trust in autonomy, workload, and situation awareness (TWSA) will lay the groundwork for robust deep-space human-habitat teaming. Current gold-standard measures of TWSA are often subjective in nature and require the administration of obtrusive questionnaires. Additionally, humans' psychophysiological responses have long been studied as a proxy for TWSA, yet much of this work has been constrained to the laboratory environment and has tended to monitor only one facet of TWSA at a time. Our research aims to develop a novel methodology for monitoring and discriminating TWSA cognitive states, given a lean and unobtrusive psychophysiological data stream. We have begun this process by developing an immersive and contextualized piloting task inside our HL-20 Dream Chaser mockup, where we can manipulate and measure TWSA. Psychophysiological measures collected include electrocardiogram (ECG), respiration rate, electrodermal activity (EDA), and eye tracking. Moreover, we have developed embedded measures that leverage specific aspects of task performance to infer TWSA cognitive states. These techniques have larger implications for the field of human-computer interaction as a whole and will become increasingly relevant across aerospace and other domains as autonomous systems grow more ubiquitous.
Collaborators: Dr. To