  • Finding the North on a Lunar Microrover: a Lunar Surface Environment Simulator for the Development of Vision-Based Navigation Pipelines

    Paper number

    IAC-18,A3,IP,23,x47008

    Author

    Mr. Fabian Dubois, Japan, ispace, Inc

    Coauthor

    Mr. Louis Burtz, Japan, ispace, Inc

    Coauthor

    Mr. Oriol Gásquez García, Spain, ispace, Inc

    Coauthor

    Mr. Takahiro Miki, Japan, ispace, Inc

    Year

    2018

    Abstract
    In the coming years, ispace, Inc. plans to deploy several microrovers on the lunar surface. Localization systems are needed for efficient exploration, and localization and navigation are the foundation for map making, a core ispace goal for enabling in-situ resource identification and utilization.
    
    However, the Moon lacks a global positioning infrastructure (such as GPS on Earth) and has no appreciable global magnetic field (preventing the use of a compass). 
    Furthermore, the lunar regolith provides a challenging environment for wheel odometry due to wheel slip. 
    Roll and pitch estimation is readily available through gravity-vector sensing from an IMU, but the heading angle estimate suffers from gyroscope drift over time. 
    Therefore, rovers need to incorporate visual cues to perform localization. 
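    As a minimal illustration of the sensing situation described above (not code from the paper; names and conventions are assumptions), the sketch below shows why gravity sensing gives roll and pitch but not heading: rotating the body about the gravity vector leaves the accelerometer reading unchanged.

    ```python
    import math

    def roll_pitch_from_accel(ax, ay, az):
        """Estimate roll and pitch (radians) from a stationary accelerometer's
        reading of the gravity vector in the body frame (z-up convention).
        Yaw/heading is unobservable here: any rotation about the gravity
        vector produces the same (ax, ay, az), which is why absolute heading
        requires other cues (e.g. visual ones)."""
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.hypot(ay, az))
        return roll, pitch

    g = 9.81  # lunar or terrestrial magnitude does not matter; only direction does

    # Level rover: gravity reads straight down the body z-axis.
    r0, p0 = roll_pitch_from_accel(0.0, 0.0, g)

    # Rover rolled 30 degrees: gravity tilts into the body y-axis.
    r30, _ = roll_pitch_from_accel(0.0, g * math.sin(math.radians(30)),
                                   g * math.cos(math.radians(30)))
    ```

    The same level reading is produced for every heading, so an IMU alone must integrate gyroscope rates for heading, accumulating the drift mentioned above.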
    
    Additional constraints on the architecture of the vision system result from the ispace platform's strong focus on being the most compact and lightweight planetary rover flown to date. 
    
    This architecture (size, mass, electrical and computing power) precludes the use of full-fledged terrestrial SLAM solutions. It also requires avoiding single-purpose sensors (such as a dedicated star tracker) and prevents implementing some of the strategies used in recent Mars exploration missions (such as Sun tracking via a mast-mounted panning camera and a dedicated sundial). 
    
    In this constrained context, heading estimation is the first problem to address in order to provide absolute positioning on the Moon. 
    Additionally, knowledge of the rover's heading angle is key to ensuring that the rover complies with related mission constraints:
    \begin{itemize}
        \item power generation: orientation of the body-mounted solar panels relative to the Sun
        \item thermal: operational temperatures of the electronics
    \end{itemize}   
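    To make the power-generation constraint concrete, the illustrative sketch below (not from the paper; the panel geometry and all names are hypothetical) applies the standard cosine law of irradiance to a body-mounted panel: generated power falls off with the angle between the panel normal and the Sun direction, so a heading error directly costs power.

    ```python
    import math

    def panel_power_fraction(heading_deg, sun_azimuth_deg):
        """Fraction of peak power for a vertically mounted, body-fixed solar
        panel whose normal is assumed to point along the rover heading,
        illuminated by a low Sun at the given azimuth. Uses the cosine law
        of irradiance; zero when the Sun is behind the panel."""
        angle = math.radians(heading_deg - sun_azimuth_deg)
        return max(0.0, math.cos(angle))

    full = panel_power_fraction(90.0, 90.0)      # panel facing the Sun
    degraded = panel_power_fraction(150.0, 90.0) # 60-degree heading error
    ```

    Under these assumptions, a 60-degree heading error halves the collected power, which is why heading knowledge feeds directly into power and thermal planning.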
    
    The paper describes a vision-based heading estimation pipeline using the rover's fixed cameras. To support development and verification, we generate an image dataset through an interactive lunar environment simulation based on the Gazebo framework. This simulation enables the generation of camera images as they would be obtained from the rover under various terrain and lighting conditions. It also provides a platform for rapid iteration on the logic of the vision pipeline's components.
    
    The results of tests performed with different estimation strategies are compared. 
    Validation on physical analogs (a rover with flight-model cameras and mobility systems in a lunar lighting analog environment) is presented and discussed. The paper concludes by summarizing how vision-based navigation pipelines enable increased mission capabilities for lunar exploration microrovers.

    Abstract document

    IAC-18,A3,IP,23,x47008.brief.pdf

    Manuscript document

    IAC-18,A3,IP,23,x47008.pdf (authorized access only).

    To get the manuscript, please contact IAF Secretariat.