  • Hazard Detection & Avoidance Integration and Demonstration for Autonomous Moon Landing

    Paper number

    IAC-22,A3,IPB,29,x69545

    Author

    Dr. Jean-Francois Hamel, Canada, NGC Aerospace Ltd.

    Year

    2022

    Abstract
    It is generally recognised that future Moon landing platforms will require a “global access” capability: the ability to land precisely at any location on the Moon, on various types of terrain that may be hazardous for the Lander. To do that, future Lander systems will require a Hazard Detection and Avoidance (HDA) capability.
    
    The HDA function analyses the terrain topography to identify landing hazards (roughness, large slopes, shadowed areas). It commands the sensors (scanning Lidar and camera), processes the sensor data, reconstructs the terrain topography, generates surface hazard maps for slope, roughness and shadow, and combines this information to recommend a safe landing site meeting all the safety and Lander manoeuvrability constraints. 
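    The map-combination step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names, thresholds (15° slope, 0.3 m roughness) and the simple 3x3 roughness estimate are assumptions chosen only to show how slope, roughness and shadow maps combine into a single safety mask.

```python
import numpy as np

def box_mean3(a):
    """3x3 mean with edge padding; a crude estimate of the local terrain trend."""
    p = np.pad(a, 1, mode="edge")
    return (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:] +
            p[1:-1, :-2] + p[1:-1, 1:-1] + p[1:-1, 2:] +
            p[2:, :-2] + p[2:, 1:-1] + p[2:, 2:]) / 9.0

def safe_site_map(dem, cell_size, sun_mask=None,
                  slope_limit_deg=15.0, roughness_limit=0.3):
    """Combine slope, roughness and shadow hazard maps into one safety mask.

    dem       -- 2-D array of terrain heights (m), one value per grid cell
    cell_size -- grid spacing (m); thresholds are illustrative assumptions
    sun_mask  -- boolean map of illuminated cells (None = fully lit)
    """
    # Slope map: terrain gradient magnitude, converted to degrees.
    gy, gx = np.gradient(dem, cell_size)
    slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
    # Roughness map: height deviation from the local 3x3 trend surface.
    roughness = np.abs(dem - box_mean3(dem))
    # Shadow map: cells the camera cannot characterise count as unsafe.
    lit = np.ones_like(dem, dtype=bool) if sun_mask is None else sun_mask
    # A cell is a candidate landing site only if it passes all three checks.
    return (slope_deg < slope_limit_deg) & (roughness < roughness_limit) & lit
```

    A real system would additionally enforce Lander footprint and manoeuvrability constraints before recommending a site; here the mask only marks per-cell safety.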
    
    An important challenge for the HDA function is the need for motion compensation. Acquiring the scanning Lidar data takes several seconds, and the Lander is moving (in both attitude and translation) during this time, so the Lidar measurements appear “distorted” by the change of relative pose over the scan. The HDA function therefore relies on the outputs of the navigation system both to process the sensor measurements and to actively command the sensor so that the desired coverage and resolution are maintained. The integration of the navigation and HDA components is key to achieving the required hazard detection reliability and to enabling accurate retargeting toward the identified landing site.
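    The motion-compensation idea can be sketched as a per-return pose correction: each Lidar point is transformed through the navigation pose valid at its own acquisition time into a single common frame. This is a minimal sketch under assumed interfaces; in particular `pose_at(t)`, returning a sensor-to-world rotation and position from the navigation filter, is hypothetical.

```python
import numpy as np

def compensate_scan(points, times, pose_at):
    """Re-express every Lidar return in the sensor frame at scan-end time,
    removing the apparent distortion caused by Lander motion during the scan.

    points  -- (N, 3) array of returns, each in the sensor frame at its time
    times   -- (N,) acquisition time of each return (s)
    pose_at -- assumed navigation interface: pose_at(t) -> (R, p), the
               sensor-to-world rotation matrix and sensor position at time t
    """
    R_end, p_end = pose_at(times[-1])
    out = np.empty_like(points, dtype=float)
    for i, (pt, t) in enumerate(zip(points, times)):
        R_t, p_t = pose_at(t)               # pose when this return was taken
        world = R_t @ pt + p_t              # sensor frame at time t -> world
        out[i] = R_end.T @ (world - p_end)  # world -> sensor frame at scan end
    return out
```

    In practice the pose history comes from the optical navigation filter, so navigation errors propagate directly into the reconstructed topography, which is why the abstract stresses the navigation/HDA integration.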
    
    Such landing platforms typically baseline optical navigation for the descent and landing operations. Validating the integration between this optical navigation system and the HDA system is an important challenge: the navigation must produce flight-representative state-estimation performance, which in turn requires flight-representative image and sensor inputs.
    
    Development activities are ongoing on a fully integrated optical navigation and HDA system composed of a scanning Lidar sensor, a camera sensor, an inertial measurement unit and an embedded processing unit on which the optical navigation and HDA functions run in real time. The paper will present the latest demonstration results for this integrated system in a full-scale dynamic environment, using a UAV platform to emulate Lander dynamics. It will demonstrate the integration of relative navigation and HDA, and discuss overall performance against the hazard detection reliability and landing accuracy requirements. The paper will conclude with the way forward for integrating this system into upcoming mission programs.
    Abstract document

    IAC-22,A3,IPB,29,x69545.brief.pdf

    Manuscript document

    IAC-22,A3,IPB,29,x69545.pdf (🔒 authorized access only).

    To get the manuscript, please contact the IAF Secretariat.