The Effect of Microgravity on Visual Search Guided by Augmented Reality

    Paper number

    IAC-20,B3,4-B6.4,8,x61336

    Author

    Dr. Daniela Markov-Vetter, Germany, University of Rostock

    Year

    2020

    Abstract
    When astronauts perform operational tasks, they continuously need to search for points of interest using secondary sources, such as location codes and labeled images. Augmented reality (AR) not only expands the perception of reality, it also has the capability to guide the viewpoint towards a target object. The method used for viewpoint guidance determines the way spatial knowledge is acquired. Methods providing survey knowledge, such as an overview map, result in an allocentric orientation. In contrast, guidance through route knowledge, such as an arrow-based method registered in the real world, triggers spatial orientation in an egocentric way. In microgravity it has been shown that the loss of gravity causes spatial disorientation and that visual tasks benefit from an egocentric reference frame. However, visual search guided by an egocentric method bears the risk of attentional tunneling and of missing concurrent events, which may lead to serious consequences during spaceflight. To apply AR to space operations in the right way, the trade-off between the localization performance and the mental workload demanded by the guidance method must be resolved. This paper presents a parabolic flight study in which we investigated the effect of altered gravity on AR-guided visual search. Besides manipulating the gravitational force level (1g, 1.8g, 0g), we varied the condition of viewpoint guidance to compare the localization performance resulting from a screen-stabilized map and a world-anchored arrow method. To identify the added value of AR applied to space tasks, we also considered the traditional guidance method using a map, but virtually fixed beside the augmented workspace. Various control measurements were used to identify the effect of confounders, such as scopolamine and flight condition. The experimental task corresponded to a cued search task that was interrupted by a secondary reaction-time task to detect cognitive overload. Both tasks, primary and secondary, demanded visual attention. In addition, we evaluated the workload level by analyzing heart rate variability and by subjective experiences using the NASA-TLX. The results provide evidence that arrow-based guidance leads to significantly faster viewpoint guidance and ensures the correct localization of target objects. Because this guidance method corresponds to the fundamental characteristics of AR, we were able to demonstrate the benefit of augmented reality in microgravity. We also showed that the arrow-based method led to significantly faster detection of concurrent events in normogravity, while there were no differences between the guidance methods in altered gravity.
    Abstract document

    IAC-20,B3,4-B6.4,8,x61336.brief.pdf

    Manuscript document

    (absent)