    Vision-aided navigation system for reusable rocket upright landing

    Paper number

    IAC-16,D2,3,10,x32761

    Author

    Dr. Shibo Gao, Beijing Aerospace Automatic Control Institute, China

    Coauthor

    Dr. Lei Gao, Beijing Aerospace Automatic Control Institute, China

    Coauthor

    Mr. Liping Xiao, Beijing Aerospace Automatic Control Institute, China

    Coauthor

    Prof. Yongmei Cheng, Northwestern Polytechnical University, China

    Coauthor

    Mr. Shun Yao, Northwestern Polytechnical University, China

    Coauthor

    Dr. Shaojun Li, Beijing Aerospace Automatic Control Institute, China

    Coauthor

    Mr. Bo Tang, Beijing Aerospace Automatic Control Institute, China

    Year

    2016

    Abstract
    At present, space rockets are one-shot machines: after boosting their payload to the required speed and altitude, they fall back to Earth and often break up in the atmosphere, which is one reason space flight is so expensive. Reusable rockets have been under development for a number of years to enable full and rapid reusability of space launch vehicles, offering the prospect of low-cost, highly reliable access to space. One of the key techniques for the precise and safe return of a reusable rocket is upright landing, and autonomous landing requires an accurate estimate of the rocket's position and attitude relative to the landing pad. To overcome the limitations of GPS and the inertial measurement unit (IMU), a vision-based position and attitude estimation method for rocket navigation is described. The proposed vision-aided precision-guidance scheme comprises navigation cameras mounted on the vehicle, cooperative targets in the landing area, an image processing module, and a vision-based relative position/attitude solution unit. The camera suite consists of four oblique-view and four downward-view navigation cameras, all of which are near-infrared imaging sensors. The cooperative marks comprise one landing mark on the landing pad and four assistant marks distributed in a circle around the pad. Ground images of the landing area captured by the near-infrared sensors are fed into the image processing module; when the detection result is valid, the relative position and attitude are computed from the image by the solution unit. The method for estimating the vehicle's position and attitude from images is presented in detail.
    The preliminary experimental results on simulated images verify that the proposed vision-aided precision-guidance scheme can achieve the navigation objectives to some extent. In future work, fusing the vision-based position and attitude estimates with GPS and IMU measurements using an extended Kalman filter should be studied.
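    The abstract does not give the authors' solution method, but since the cooperative marks all lie on the ground plane, the relative pose can be recovered from a single image by estimating the plane-to-image homography and decomposing it against the camera intrinsics. The sketch below illustrates that standard planar-target approach; the function names, the marker layout, and the intrinsics matrix `K` are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def homography_from_points(world_xy, image_uv):
        """Estimate the 3x3 homography mapping planar world points (Z = 0,
        e.g. cooperative marks around the landing pad) to image pixels,
        via the direct linear transform (DLT). Needs >= 4 correspondences."""
        A = []
        for (X, Y), (u, v) in zip(world_xy, image_uv):
            A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
            A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
        _, _, Vt = np.linalg.svd(np.asarray(A))
        H = Vt[-1].reshape(3, 3)          # null-space vector = homography
        return H / H[2, 2]

    def pose_from_homography(H, K):
        """Decompose H = K [r1 r2 t] (up to scale) into the rotation R and
        translation t of the landing plane relative to the camera."""
        A = np.linalg.inv(K) @ H
        lam = 1.0 / np.linalg.norm(A[:, 0])   # scale making r1 unit length
        if lam * A[2, 2] < 0:                 # keep the pad in front of camera
            lam = -lam
        r1, r2, t = lam * A[:, 0], lam * A[:, 1], lam * A[:, 2]
        R = np.column_stack([r1, r2, np.cross(r1, r2)])
        U, _, Vt = np.linalg.svd(R)           # project onto SO(3)
        return U @ Vt, t
    ```

    With the central landing mark at the world origin and the four assistant marks on a known circle, detecting their image positions and calling these two functions yields the camera-to-pad pose that the filter (e.g. the EKF mentioned as future work) would consume.
    
    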
    Abstract document

    IAC-16,D2,3,10,x32761.brief.pdf

    Manuscript document

    IAC-16,D2,3,10,x32761.pdf (🔒 authorized access only).

    To get the manuscript, please contact IAF Secretariat.