  • A Real Time Implementation of a Passive Optical Terrain Navigation System

    Paper number

    IAC-08.C1.5.5

    Author

    Mr. Doug Reid, Johns Hopkins University Applied Physics Lab, United States

    Coauthor

Mr. Adrian Hill, CONICET and University of Buenos Aires, United States

    Coauthor

    Mr. Jonathan Green, Johns Hopkins University Applied Physics Lab, United States

    Coauthor

    Mr. William Innanen, Johns Hopkins University Applied Physics Laboratory, United States

    Year

    2008

    Abstract
This paper describes the software prototyping and development of a real-time “Passive Optical Terrain Navigation” (TRN) algorithm that may one day be a major component of a crewed or uncrewed lander capable of landing precisely and safely, in almost any lighting conditions, on the Moon, an asteroid, Mars, or any other planetary body. The TRN algorithm, currently being developed as a real-time, flight-qualified, embedded application, is enabled for about twenty minutes as the lander flies over the terrain, descending from approximately one hundred kilometers to about four kilometers in altitude. During the descent, two optical cameras, one on the right side of the lander and one on the left, with overlapping fields of view, image the terrain as it sweeps past. This image data, captured at a rate of about two images per second, is compared against an on-board “map”. The differences between where the algorithm estimates the lander to be and where it actually is, combined with the fact that there are two images, make it possible to determine the longitude, latitude, and altitude of the lander relative to the surface of the planet. This solution is passed to the “Navigation” portion of the landing algorithm, which presumably knows where it’s going and uses the information to help it on its way.
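As a rough illustration of the map-matching step described above, the sketch below slides a camera image over a larger patch rendered from the on-board map and scores each offset with normalized cross-correlation. The function name, the brute-force search, and the correlation method are illustrative assumptions for this abstract, not the paper's actual algorithm.

```python
import numpy as np

def correlate_fix(camera_img, rendered_map):
    """Find the pixel offset at which camera_img best matches a window of
    rendered_map, using normalized cross-correlation (illustrative sketch)."""
    # Zero-mean the camera image so brightness offsets do not bias the match.
    c = camera_img - camera_img.mean()
    h, w = c.shape
    mh, mw = rendered_map.shape
    c_norm = np.sqrt((c ** 2).sum())
    best_score, best_off = -np.inf, (0, 0)
    # Exhaustively test every candidate offset of the camera image
    # within the rendered map patch.
    for dy in range(mh - h + 1):
        for dx in range(mw - w + 1):
            patch = rendered_map[dy:dy + h, dx:dx + w]
            p = patch - patch.mean()
            denom = c_norm * np.sqrt((p ** 2).sum())
            if denom == 0:
                continue  # flat patch carries no matching information
            score = (c * p).sum() / denom  # NCC in [-1, 1]
            if score > best_score:
                best_score, best_off = score, (dy, dx)
    # The winning pixel offset would then be converted, via camera geometry,
    # into a longitude/latitude/altitude correction.
    return best_off, best_score
```

In a flight application this exhaustive search would be far too slow for the roughly one-second budget the abstract describes; an FFT-based correlation or hardware-accelerated search would be used instead.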
    
There are three main components discussed: the TRN algorithm itself; a real-time Camera Emulation, which provides simulated so-called “truth” images as part of the testing and validation effort; and the Digital Elevation Map (DEM) Manager, which manages the on-board database of terrain maps. Of particular note are the two main challenges the development team faced: first, the sheer lack of CPU horsepower; and second, the excessive amount of memory required by the DEM. With the processors likely to be qualified to fly on such missions, there exists a CPU bottleneck in capturing the image, rendering a comparative map from the DEM database, and performing a correlation between the two images, then doing it all again for the second camera, all within a period of approximately one second. And even though the individual images and their rendered counterparts do not require excessive amounts of memory, the DEM covering the trajectory does: depending on the trajectory, estimates range from two to four gigabytes, a lot of memory for an embedded, flight-qualified application.
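The two-to-four-gigabyte DEM figure can be made concrete with back-of-the-envelope arithmetic. Every value below (corridor length, swath width, post spacing, bytes per elevation post) is an assumed number chosen only to show how estimates of that magnitude arise; none of them come from the paper.

```python
def dem_size_gb(track_km, swath_km, post_m, bytes_per_post=4):
    """Rough DEM storage for a descent corridor: number of elevation
    posts along and across the ground track, times bytes per post.
    All parameter values are illustrative assumptions."""
    posts_along = track_km * 1000 / post_m    # posts along the ground track
    posts_across = swath_km * 1000 / post_m   # posts across the swath
    return posts_along * posts_across * bytes_per_post / 1e9

# e.g. an assumed 1000 km ground track, 20 km swath, 5 m post spacing,
# 4-byte elevations:
size = dem_size_gb(1000, 20, 5)
```

With these assumed values the estimate comes out to about 3.2 GB, inside the two-to-four-gigabyte range the abstract quotes; longer trajectories or finer post spacing push it higher.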
    
This paper provides a description of each main component of the design, accompanied by a brief description of the underlying theory. The emphasis, however, is on the real-time design, software layout, and construction of the TRN and Camera Emulation. Basic test results and comparative images are shown, including preliminary benchmark results. Finally, a plan to incorporate hardware-accelerated rendering is described.
    
    Abstract document

    IAC-08.C1.5.5.pdf

    Manuscript document

IAC-08.C1.5.5.pdf (authorized access only).

    To get the manuscript, please contact IAF Secretariat.