  • Virtual Prototyping of Human-Machine Interaction for Remote Control of Space Autonomous Manipulation Robots Based on Augmented Reality Technology

    Paper number

    IAC-16,B3,6-A5.3,7,x34571

    Author

    Dr. Alexey Karpov, Russian Federation

    Coauthor

    Prof. Mikhail Mikhaylyuk, Russian Federation

    Coauthor

    Prof. Vitali Usov, Gagarin Cosmonaut Training Center, Russian Federation

    Coauthor

    Dr. Boris I. Kryuchkov, Yu.A. Gagarin Research and Test Cosmonaut Training Center, Russian Federation

    Year

    2016

    Abstract
    The paper describes human-machine interaction between a cosmonaut and an autonomous manipulation robot (AMR) during intra-vehicular crew activity, in particular when performing on-board systems maintenance and setting up measuring and scientific devices. To ensure the safety of manned space flights, it is proposed to use interactive virtual environments (IVE) to give the human operator (H-O) a visual representation of the robot’s executive activity in a work zone outside direct visual crew control. A characterization of on-board systems and units maintenance tasks, taking robot positionability into account, is presented. Based on virtual prototyping, a search has been conducted for methods of displaying to the operator data about robot positioning in a work zone on the ISS. It is proposed to use visualization modes that facilitate the H-O’s perception and code conversion of telemetric data from the robot, as well as rapid "engagement" of the H-O in control of the AMR in the supervisory mode when collisions of the AMR with objects on board are detected. To enable voice commands for the robot’s emergency actions in cases where the H-O decides that there are prerequisites for a collision, a method for preliminary verification of the execution of voice commands within the robot’s planned activity is presented. It is supposed that, for predicting the results of the robot’s activity, the H-O needs:
    
    a)	the current state of the work environment, including the robot’s activity parameters;
    
    b)	predicted changes in the work environment and in the positioning of the AMR over a set time interval, taking into account the stereotyped operations programmed beforehand for the robot and the results of their execution;
    
    c)	visualized results of the execution of the voice commands given to the AMR by the H-O.
    
    Thus, when making such decisions, the H-O will have visual representations of the current state of the work environment and of the possible future state of the IVE in two variants for comparison: a) the "natural course" of events without H-O intervention; b) a possible change of the situation under the H-O’s voice commands and transition to supervisory mode control.
    
    In this regard, the paper deals with the construction of an induced IVE that functions in real time and reproduces the behavior of the prototype ergatic system when control actions are applied to the control elements of on-board systems and when constituent parts are replaced during maintenance.
    Abstract document

    IAC-16,B3,6-A5.3,7,x34571.brief.pdf

    Manuscript document

    (absent)