  • Verification of Visual Navigation Performance using PANGU applied to Lunar Landing

    Paper number

    IAC-14,A3,2C,9,x25123

    Author

    Dr. Edgar Zaunick, Airbus Defence and Space, Germany

    Coauthor

    Mr. Bernard Polle, Airbus Defence and Space, France

    Year

    2014

    Abstract
    The purpose of this paper is to discuss the verification of the performance of visual navigation applied to a Lunar landing scenario using PANGU (Planet and Asteroid Natural scene Generation Utility). In autonomous navigation applications, visual navigation has quickly established itself as a means of estimating a vehicle’s pose relative to the surface from camera data, and it has often become the key technology for meeting position and attitude accuracy requirements.
    
    In the early development phase it is more practical to evaluate the performance of the visual navigation system with a performance model, because this allows fast implementation, extensive parametric analyses and fast Monte Carlo analyses.
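
A Monte Carlo performance analysis of this kind can be illustrated with the toy model below. The error model and all parameter values are illustrative assumptions for the sketch, not figures from the paper: navigation position error is taken to scale with feature-tracking noise via the pinhole ground-sample distance.

```python
import numpy as np

rng = np.random.default_rng(0)

def nav_position_error(feature_noise_px, altitude_m, focal_px=1000.0):
    """Toy performance model (hypothetical): maps camera feature-tracking
    noise [px] to a position-estimation error [m] via the pinhole scale."""
    ground_sample_m = altitude_m / focal_px      # metres per pixel on ground
    return feature_noise_px * ground_sample_m    # 1-sigma position error [m]

# Monte Carlo: draw dispersed parameters and collect error statistics.
n_runs = 10_000
noise = np.abs(rng.normal(0.5, 0.1, n_runs))     # feature noise [px], assumed
alt = rng.uniform(2_000.0, 10_000.0, n_runs)     # altitude [m], assumed range
errors = nav_position_error(noise, alt)

print(f"mean error: {errors.mean():.2f} m, "
      f"99th percentile: {np.percentile(errors, 99):.2f} m")
```

Because each trial is a cheap closed-form evaluation rather than an image-in-the-loop run, tens of thousands of dispersed cases can be swept in seconds, which is precisely the advantage of the performance-model stage.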
    
    The next step is to test the visual navigation system with image processing in the loop. PANGU, developed by the Space Technology Centre at the University of Dundee for ESA, is a tool especially suited to modelling the surfaces of planetary bodies such as Mars, the Moon, Mercury and asteroids from real and synthetic data. It can generate camera images from any position and orientation; these images can be used in simulations of planetary landing, surface roving and in-orbit rendezvous operations. In the frame of the past European Lunar Lander mission and system study, PANGU was selected to verify the performance of the visual navigation system and to check the results obtained with its performance model.
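
The camera geometry underlying such image generation can be sketched with a standard pinhole projection. The function below is a hypothetical stand-in used only for illustration, not PANGU's actual interface: it projects surface points expressed in a world frame into pixel coordinates for a camera at a given position and attitude.

```python
import numpy as np

def project(points_w, cam_pos, R_wc, focal_px=1000.0, cx=512.0, cy=512.0):
    """Pinhole projection of world points into a camera at pose (cam_pos, R_wc).
    R_wc rotates world-frame vectors into the camera frame. All parameter
    names and values are illustrative assumptions, not PANGU's camera model."""
    p_c = (R_wc @ (points_w - cam_pos).T).T      # points in camera frame
    z = p_c[:, 2]
    u = focal_px * p_c[:, 0] / z + cx
    v = focal_px * p_c[:, 1] / z + cy
    return np.column_stack([u, v]), z > 0        # pixel coords + visibility

# Example: a surface point 5 km directly below a nadir-pointing camera
# projects onto the image centre (cx, cy).
pts = np.array([[0.0, 0.0, 0.0]])
uv, vis = project(pts, cam_pos=np.array([0.0, 0.0, 5000.0]),
                  R_wc=np.diag([1.0, 1.0, -1.0]))
```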
    
    This paper describes the performance verification of the feature tracking navigation, which comprises the following steps:
    • PANGU is used to generate images as taken by the on-board navigation camera along the trajectory until touchdown.
    • The images are fed to the actual visual navigation system, which is based on the FEIC (Feature Extraction Integrated Circuit). The FEIC processes the images by performing feature selection and feature tracking, so that the real visual navigation algorithm can be applied to quasi-real images.
    • The resulting tracked features are fused in a navigation filter together with other available sensor data.
    • The results obtained with PANGU are compared with those of the performance model of the visual navigation system.
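
The fusion step in this pipeline can be sketched with a minimal one-dimensional Kalman filter, in which position fixes derived from tracked features correct a propagated position/velocity state. All noise values and the simulated fixes are illustrative assumptions, not results from the paper:

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])            # constant-velocity model
H = np.array([[1.0, 0.0]])                       # position measured only
Q = np.diag([1e-3, 1e-2])                        # process noise (assumed)
R = np.array([[4.0]])                            # feature-fix noise [m^2]

x = np.array([[0.0], [0.0]])                     # state: position, velocity
P = np.diag([100.0, 10.0])                       # initial covariance

for z in [10.2, 10.9, 11.8]:                     # simulated position fixes [m]
    x, P = F @ x, F @ P @ F.T + Q                # propagate
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (np.array([[z]]) - H @ x)        # correct with feature fix
    P = (np.eye(2) - K @ H) @ P

print(f"fused position: {x[0, 0]:.2f} m")
```

In the actual system the filter state and measurement models are of course far richer (full 6-DoF pose, multiple sensors), but the propagate/correct cycle shown here is the structure into which the tracked features are fused.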
    
    The paper focuses on the verification process and also provides an overview of the Lunar Lander’s visual navigation system and of the importance of visual navigation.

    Abstract document

    IAC-14,A3,2C,9,x25123.brief.pdf

    Manuscript document

    IAC-14,A3,2C,9,x25123.pdf (🔒 authorized access only).

    To obtain the manuscript, please contact the IAF Secretariat.