  • Feature Extraction Through Time

    Paper number

    IAC-06-B4.4.03

    Author

    Mr. Elliott Coleshill, MacDonald Dettwiler Robotics, Canada

    Coauthor

    Dr. Alex Ferworn, Ryerson University, Canada

    Coauthor

    Dr. Deborah Stacey, University of Guelph, Canada

    Year

    2006

    Abstract

    Robots that perform repetitive tasks without continuous control have been used for many years in automated manufacturing installations. These systems are programmed to perform simple tasks with little or no ability to react to changes in their environment. Safety is maintained and collisions are prevented through the use of barriers, warning indicators, and close control of the environment. The space environment, however, is extremely different from that on Earth, and the consequences of a collision for a robotic system during a space flight could be catastrophic. Even a minor collision between the robot and an object in this environment could cause a system failure, loss of mission, or even the death of an astronaut.

    One approach to solving this problem is to use real-time vision data to authenticate the synthetic models used in collision detection. There are many techniques for analyzing digital images; however, space-based imagery is constrained by a number of technical challenges:

    • Imaging Repeatability. Subsequent scans of a camera view in space cannot achieve the same imaging viewpoint because of the lack of robot/camera positioning repeatability.

    • Lighting Variation. On orbit, surface appearance can change drastically due to the variation in ambient light (solar and Earth light) induced by orbital motion, and the sheer lack of contrast.

    • Object Appearance. The surface and reflectivity of an object can change due to micrometeorite damage and exposure to ultraviolet radiation and atomic oxygen.

    • System Constraints. Efficient computer processing is a must, given the computational limitations imposed by the need to use compact, lightweight, low-power, space-qualified computers.

    This paper addresses the lighting-variation problem in order to extract key features within a scene. It is proposed that the drastic change in sunlight and shadow within an image over time can be used to filter out extreme lighting conditions. Spacecraft travel at high speed, causing sunlight and shadow to pass over a scene in a matter of minutes. Using this motion to our advantage, the extreme sunlight can be tracked and filtered out to generate a single low-intensity image without loss of image quality in pixel contrast. With a less sunlight-intensive image, key objects should be extractable for further analysis.
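    The idea of combining a time sequence of frames into one low-intensity image can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes a stack of co-registered grayscale frames and uses a per-pixel temporal minimum, which is one simple way to suppress transient saturated sunlight as it sweeps across the scene (the function name and NumPy-based approach are the editor's assumptions).

    ```python
    import numpy as np

    def filter_extreme_lighting(frames):
        """Combine co-registered grayscale frames taken over time into a
        single low-intensity image.

        A pixel struck by direct sunlight in some frames but shadowed or
        evenly lit in others keeps its lowest observed value, so glare that
        moves across the scene with orbital motion is discarded while the
        underlying surface detail is retained.

        frames: iterable of 2-D arrays of identical shape (uint8 or float).
        Returns a float32 array of the same height and width.
        """
        stack = np.stack([np.asarray(f, dtype=np.float32) for f in frames])
        # Per-pixel minimum across the time axis filters out the extreme
        # (high-intensity) lighting observed in any individual frame.
        return stack.min(axis=0)
    ```

    In practice a temporal median or a saturation-aware weighting might preserve contrast better than a raw minimum (which is biased toward shadowed frames), but the minimum shows the core idea of exploiting the motion of sunlight over the scene.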

    Abstract document

    IAC-06-B4.4.03.pdf

    Manuscript document

    IAC-06-B4.4.03.pdf (🔒 authorized access only).

    To get the manuscript, please contact IAF Secretariat.