Optic Flow-Based Navigation System for Planetary Rovers

    Paper number

    IAC-17,A3,3B,13,x40009

    Author

    Mr. Naoto Kobayashi, Kyushu University, Japan

    Coauthor

    Dr. Mai Bando, Kyushu University, Japan

    Coauthor

    Dr. Shinji Hokamoto, Kyushu University, Japan

    Year

    2017

    Abstract
    For planetary exploration rovers, environment recognition and avoidance of getting stuck are key technologies. Rovers must identify terrain features such as rocks, slopes, and other hazardous objects with their onboard computers for autonomous path planning. In addition, typical wheeled rovers tend to get stuck in loosely deposited soil, so the risk of getting stuck must be detected during real-time navigation. Current navigation systems fall mainly into two categories: stereovision and Laser Range Finder (LRF). Stereovision is widely used for navigation on planets; however, it imposes a heavy computational load, so real-time processing on onboard computers is difficult. With LRF processing, terrain mapping can be performed by an onboard computer; however, since no texture information is obtained, LRF systems carry a larger risk of getting stuck than camera-based navigation.
    
    In this paper, a novel optic flow-based navigation system is proposed. Optic flow, which is obtained from image sensors, is the vector field of relative velocities between a camera and surrounding objects. Although optic flow contains information on both relative distance and velocity, general optic flow processing can recover only one of them. To estimate both, we combine a robust optic flow processing technique called Wide-Field Integration (WFI) of optic flow with an image segmentation technique from computer vision. WFI of optic flow is a state estimation technique utilizing the integration of optic flow over an image region. In the proposed method, a camera image is segmented into meaningful regions (such as rocks and ground) based on image brightness. By applying WFI of optic flow to each segmented image region, the rover’s velocity is estimated from one segment, and relative distances to objects are estimated from the other segments. Thus both the surrounding environment and the rover’s velocity can be estimated. Since the estimation is accomplished with linear processing, real-time estimation is possible on onboard computers. In addition, the risk of getting stuck can be reduced by monitoring changes in the slip ratio, which is evaluated from the rover’s velocity and the wheels’ rotational speed. In this presentation, the effectiveness of the proposed method is examined with numerical simulations.
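    The linear-processing claim and the slip-ratio check described above can be sketched roughly as follows. This is an illustrative sketch, not the paper's actual formulation: the function names and basis matrix are hypothetical, a WFI-style estimate is represented here as an ordinary least-squares fit of flow samples to a linear state model, and the slip ratio uses the standard driving definition s = (r·ω − v)/(r·ω), which the abstract does not spell out.

    ```python
    import numpy as np

    def estimate_state_wfi(flow, basis):
        """WFI-style linear state estimate over one image segment (illustrative).

        flow  : (n,) optic flow samples from the segment
        basis : (n, m) matrix mapping an m-dimensional state
                (e.g. velocity or inverse distance) to the flow samples
        """
        # Least squares is linear in the measurements, which is why
        # this kind of estimate can run in real time on an onboard computer.
        x, *_ = np.linalg.lstsq(basis, flow, rcond=None)
        return x

    def slip_ratio(v_rover, omega_wheel, wheel_radius):
        """Standard driving slip ratio s = (r*omega - v) / (r*omega).

        0 means no slip; values approaching 1 mean the wheel is
        spinning with little forward motion, i.e. the rover is stuck.
        """
        v_wheel = wheel_radius * omega_wheel
        return (v_wheel - v_rover) / v_wheel
    ```

    A rising slip ratio computed this way from the WFI velocity estimate and the measured wheel speed would flag an increasing risk of getting stuck.
    
    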
    Abstract document

    IAC-17,A3,3B,13,x40009.brief.pdf

    Manuscript document

    IAC-17,A3,3B,13,x40009.pdf (authorized access only)

    To get the manuscript, please contact IAF Secretariat.