  • Future Interface Technologies for Manned Space Missions

    Paper number

    IAC-11,A1,1,11,x11329

    Author

    Mrs. Daniela Markov-Vetter, Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR), Germany

    Coauthor

    Mrs. Anke Lehmann, University of Rostock, Germany

    Coauthor

    Prof. Oliver Staadt, University of Rostock, Germany

    Coauthor

    Dr. Uwe Mittag, Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR), Germany

    Year

    2011

    Abstract
    During a space mission the crew has to perform a wide variety of tasks under particular physical conditions. When handling displays and control items during a mission, the astronaut's performance depends strongly on intuitive usability. Future interface technologies, such as Augmented Reality, will provide assistance while performing technical service tasks at complex facilities. Augmented Reality (AR) interfaces combine physical-world data with computer-generated data, which is blended into reality in real time, enhancing the user's perception of the real world. Because physical reality is a crucial part of such interfaces, the user's interaction with the virtual objects should be intuitive and natural.
    Current exploration and development of different AR interfaces for Standard Payload Racks on the International Space Station (ISS) require research to identify common basic principles for handling virtual content in physical reality under weightlessness. One basic interaction technique is pointing to a virtual object to select a task. This paper explores the human ability to pick virtual objects in 3D space under different spatial alignment modalities of the virtual content, evaluating usability and performance in preparation for similar experiments under weightlessness during parabolic flights.
    It is conceivable that AR interfaces will be applied to space operations in future manned missions. Besides investigating interaction tasks, the influence of different acceleration conditions has to be considered.
    During the study the test person wears a Head-Mounted Display (HMD) through which the physical world remains visible. A virtual keyboard is blended into the user's view, and the user types text one-handed. Using different keyboard alignment modes allows us to identify their impact on the user's orientation. Given these modes, the user performs three test scenarios: (1) physical alignment (the virtual keyboard is displayed on a fixed real surface), (2) HMD alignment (the virtual keyboard is virtually fixed to the HMD), and (3) body alignment (the virtual keyboard is displayed on the user's palm). In each scenario the subject types a given word and a series of random letters on the virtual keyboard, while the user's performance time and error rate are measured. To indicate the influence of the field of view and the user's presence, we perform the evaluation with both a monocular optical see-through HMD and a binocular video see-through HMD.
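    The trial protocol above — three alignment modes, timed typing, and a per-trial error rate — could be logged with a small harness like the following. This is a minimal sketch, not the authors' actual software: the names (`Alignment`, `score_trial`, `TrialResult`) and the error-rate definition (character mismatches plus length difference, relative to target length) are illustrative assumptions.

    ```python
    from dataclasses import dataclass
    from enum import Enum, auto


    class Alignment(Enum):
        """The three keyboard-alignment scenarios described in the study."""
        PHYSICAL = auto()  # keyboard displayed on a fixed real surface
        HMD = auto()       # keyboard virtually fixed to the head-mounted display
        BODY = auto()      # keyboard displayed on the user's palm


    @dataclass
    class TrialResult:
        alignment: Alignment
        completion_time_s: float
        error_rate: float


    def score_trial(alignment: Alignment, target: str, typed: str,
                    start_s: float, end_s: float) -> TrialResult:
        """Compute completion time and a simple per-character error rate:
        position-wise mismatches plus the length difference, divided by
        the target length (an assumed metric, not the paper's definition)."""
        mismatches = sum(a != b for a, b in zip(target, typed))
        mismatches += abs(len(target) - len(typed))
        return TrialResult(alignment, end_s - start_s, mismatches / len(target))
    ```

    For example, typing "spade" for the target "space" in 4.2 seconds yields one mismatched character out of five, i.e. an error rate of 0.2.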
    The initial ground study demonstrates the usability of the metaphors under working conditions, and their measured efficiency serves as a foundation for the future study in weightlessness.
    Abstract document

    IAC-11,A1,1,11,x11329.brief.pdf

    Manuscript document

    IAC-11,A1,1,11,x11329.pdf (🔒 authorized access only).

    To get the manuscript, please contact IAF Secretariat.