  • Enabling Astronaut Autonomy Through Augmented Reality

    Paper number

    IAC-19,B3,5,6,x49140

    Author

    Mr. Eswar Anandapadmanaban, United States, Massachusetts Institute of Technology (MIT)

    Coauthor

    Mr. Nicholas Anastas, United States, Massachusetts Institute of Technology (MIT)

    Coauthor

    Mr. Philip Ebben, United States

    Coauthor

    Mr. Eric Hinterman, United States, Massachusetts Institute of Technology (MIT)

    Coauthor

    Ms. Christine Joseph, United States, Massachusetts Institute of Technology (MIT)

    Coauthor

    Mr. Steven Link, United States, Massachusetts Institute of Technology (MIT)

    Coauthor

    Ms. Julia Milton, United States, Massachusetts Institute of Technology (MIT)

    Coauthor

    Mr. Barret Schlegelmilch, United States, Massachusetts Institute of Technology (MIT)

    Coauthor

    Ms. Jessica Todd, Australia, Space Generation Advisory Council (SGAC)

    Year

    2019

    Abstract
    This paper details the development and testing of a new head-mounted display (HMD) for performing maintenance tasks during extravehicular activities (EVAs). Existing research indicates that augmented reality can effectively guide users in autonomously completing on-Earth tasks (such as occupational training or maintenance) and can enable collaborative work between geographically isolated users. As human exploration of space extends beyond low Earth orbit into deep space missions, communication latency between the spacecraft and ground support will require astronauts to make critical repairs and decisions independently of mission control. Given the complicated procedural nature of EVAs and the need for astronauts to maintain high situational awareness (SA) throughout a task, incorporating an HMD into future EVA systems can enable greater astronaut autonomy. Moving toward this goal, we must ensure that astronauts have sufficient SA of their suit systems and health without significantly increasing user workload or sacrificing task performance.
    
    An HMD was designed using the Microsoft HoloLens platform based on feedback from EVA operators and former astronauts. The HMD utilized both visual and auditory interfaces to present suit parameters while allowing user customization of the visual layout. A set of EVA procedural tasks was presented to the user in the virtual space, incorporating animations and pictorial representations of the task steps. A major design principle of the HMD was to reduce visual occlusions in the field of view.
    
    To assess workload, SA, and task performance, an analog EVA taskboard was constructed for users to perform several mock EVA tasks, including rerouting power and disabling an alarm, with and without the HoloLens platform. The tests required users to maintain adequate SA throughout the task to detect suit parameter anomalies and modify their procedure accordingly. Preliminary tests were performed at MIT and at NASA's Johnson Space Center in tandem with NASA's Spacesuit User Interface Technologies for Students (SUITS) 2018 challenge. Results suggest that while using the HMD platform increases task completion time, it enables greater autonomy from ground control. The user study did not indicate a statistically significant difference in workload between the HMD and control conditions. Observational results indicate that virtual task animations helped minimize errors in task completion and reduced subjects' requests for clarification of procedures.
    Abstract document

    IAC-19,B3,5,6,x49140.brief.pdf

    Manuscript document

    IAC-19,B3,5,6,x49140.pdf (🔒 authorized access only).

    To obtain the manuscript, please contact the IAF Secretariat.