A Cross-Site Visual Localization Method for Yutu Rover

ISPRS Technical Commission IV Symposium on Geospatial Databases and Location Based Services, 14-16 May 2014, Suzhou, China, MTSTC4-2014-69

A CROSS-SITE VISUAL LOCALIZATION METHOD FOR YUTU ROVER

W. Wan a, Z. Liu a, K. Di a,*, B. Wang b, J. Zhou b

a State Key Laboratory of Remote Sensing Science, Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, P.O. Box 9718, Datun Road, Chaoyang District, Beijing 100101, China - [email protected]
b Beijing Aerospace Control Center, Beijing 100094, China

Commission IV, WG IV/8

KEY WORDS: Planetary, Navigation, Imagery, Matching, Bundle, Accuracy

ABSTRACT:

Chang'e 3, which carries China's first lunar lander and rover, soft-landed on the Moon on December 14, 2013. The rover Yutu (Jade Rabbit), with a designed life span of three months, was released onto the lunar surface and began surface exploration on December 15. Although capable of travelling at up to 200 meters per hour, Yutu moves no more than 10 meters at low speed between adjacent waypoints, called sites, because of the uncertainties of the lunar surface. To satisfy the demands of safety and efficiency, an optimal driving path is generated from 3D perception of the surrounding environment, realized through imaging and topographic mapping with stereo cameras. Accurate localization of the rover plays an important role in path planning, providing safe guidance for obstacle avoidance and for approaching targets and waypoints. Traditional localization methods for GPS-free environments, such as radio localization and dead reckoning, are applied to Yutu to acquire its position at the present site. However, the results of these methods carry relatively large errors. For example, dead reckoning, which relies on the IMU and wheel odometry, accumulates error rapidly as travelling distance increases.
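To make the error-accumulation behaviour concrete, the following minimal sketch integrates 2D wheel-odometry steps under a slowly drifting heading estimate and inflated (slipped) step lengths. It is illustrative only: the step length, slip ratio and drift rate are assumed values, not mission parameters.

```python
import math

def dead_reckon(steps, step_len=0.5, slip=0.1, drift_per_step=0.001):
    """Integrate a 2D pose from wheel odometry, with simulated wheel slip
    (odometry over-reports distance) and IMU heading drift (rad per step).
    Returns the final position error and the true distance travelled."""
    est_x = est_y = true_x = true_y = 0.0
    heading = 0.0        # true heading (rad); the rover drives straight here
    est_heading = 0.0    # heading as seen through a drifting IMU
    for _ in range(steps):
        est_heading += drift_per_step          # accumulated gyro bias
        true_x += step_len * math.cos(heading)
        true_y += step_len * math.sin(heading)
        # odometry reports the slipped (inflated) step along the drifted heading
        est_x += step_len * (1 + slip) * math.cos(est_heading)
        est_y += step_len * (1 + slip) * math.sin(est_heading)
    err = math.hypot(est_x - true_x, est_y - true_y)
    return err, steps * step_len

err, dist = dead_reckon(200)
print(f"position error after {dist:.0f} m: {err:.1f} m ({100 * err / dist:.0f}% of path)")
```

Because both error sources are systematic, the relative error grows with distance rather than averaging out, which is why cross-site corrections become necessary.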
The error growth is driven by IMU drift and by wheel slip caused by terrain undulation, and reached up to 15% of path length in the mission. Visual localization based on sequential images, known as visual odometry (VO), was used in the 2003 Mars Exploration Rover (MER) mission on slippery and uncertain terrains to achieve better localization accuracy, i.e., 3%-5%. Cross-site localization of the MER rovers was also performed using the bundle adjustment (BA) technique with panoramic stereo images. Yutu, however, captures stereo images only at the waypoints, without sequential imaging, and the distortions between forward-looking stereo images of adjacent sites are larger than those between sequential images. A new VO method that derives accurate localization results from cross-site stereo images, with a high degree of automation and quick turnaround time, is therefore desirable for Yutu exploration applications. This paper proposes such a cross-site localization method for Yutu. Based on the Affine-SIFT (ASIFT) matching algorithm, the method obtains matched points between images of adjacent sites to construct the cross-site geometric relationship; bundle adjustment is then applied to resolve the exterior orientation parameters (EOPs) of the present site. The proposed method proceeds in five steps. Step one is image preprocessing, including image enhancement with a Wallis filter, which improves the texture of the images. Step two is feature tracking between the left images of adjacent sites. Because of the large differences between images of adjacent sites, image regions of interest (ROIs), where overlapping areas exist, are selected according to the initial EOPs provided by dead reckoning. ASIFT matching is then performed to find corresponding feature points, owing to its advantages in coping with distortion caused by rotation, scaling and skew. Step three is feature point matching in stereo images.
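Both the cross-site tracking of step two and the stereo matching of step three ultimately rest on nearest-neighbour descriptor matching with a ratio test. The NumPy sketch below uses synthetic descriptors and plain nearest-neighbour matching as a stand-in for ASIFT (which additionally simulates affine camera tilts); the ROI restriction from the initial EOPs is omitted for brevity.

```python
import numpy as np

def ratio_test_match(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbour descriptor matching with Lowe's ratio test:
    accept a match only if the best distance is clearly smaller than the
    second best, suppressing ambiguous correspondences."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

# Synthetic 8-D descriptors: desc_b starts with noisy copies of desc_a,
# followed by unrelated clutter descriptors.
rng = np.random.default_rng(0)
desc_a = rng.normal(size=(20, 8))
desc_b = np.vstack([desc_a + 0.05 * rng.normal(size=(20, 8)),
                    rng.normal(size=(30, 8))])
matches = ratio_test_match(desc_a, desc_b)
correct = sum(1 for i, j in matches if i == j)
print(f"{len(matches)} matches, {correct} correct")
```

In the real pipeline the affine simulation of ASIFT makes this matching robust to the large viewpoint change between adjacent sites, which plain SIFT descriptors alone often fail to bridge.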
To improve the matching performance, the stereo images are resampled to epipolar images using the relative EOPs from camera calibration. For given points in the left image, the corresponding points in the right image are found within a restricted search area. Step four is outlier detection in the matching results. A robust and simple outlier detection method, based on distance consistency in different Euclidean spaces with rotation, is developed to eliminate tracking errors: consistency weights of every correspondence pair are accumulated to determine and eliminate outliers. The final step is bundle-adjustment-based motion estimation. Linearized error equations, established from the collinearity equations, are derived with Taylor's theorem by taking the image coordinates of the tracked points as observations. In addition, the fixed relative EOPs between the images at one site are added to the BA solution as constraints. The unknown parameters in the BA are resolved iteratively by the least-squares principle; during the iteration, the EOPs of the images at the previous site are held fixed. To analyze the theoretical accuracy of the proposed method, an error equation is derived according to the error propagation law, and various parameter sets are input to verify the localization accuracy under different configurations. To verify the effectiveness of the proposed method in practical applications, a number of field experiments were conducted. The experimental results show that the proposed method provides more accurate localization results (1%-4% of travelled distance) than dead reckoning. After further validation and enhancement, the proposed method has been successfully applied in Chang'e 3 mission operations.

* Corresponding author.
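The distance-consistency idea of step four can be illustrated with a simplified voting scheme: a rigid motion preserves point-to-point distances, so a pair of correspondences casts a vote for both of its members when their inter-point distance agrees between the two point sets, and poorly supported correspondences are rejected as outliers. This is a sketch in that spirit, not the paper's exact formulation; the tolerance and support thresholds are assumed values, and a planar rotation plus translation stands in for the actual cross-site geometry.

```python
import numpy as np

def distance_consistency_inliers(pts_a, pts_b, tol=0.1, min_support=0.5):
    """Vote-based outlier rejection: correspondence pair (i, j) votes for
    both i and j when the inter-point distance is preserved (within `tol`
    relative difference) between the two point sets.  Correspondences
    supported by fewer than `min_support` of the others are outliers."""
    n = len(pts_a)
    votes = np.zeros(n)
    for i in range(n):
        for j in range(i + 1, n):
            da = np.linalg.norm(pts_a[i] - pts_a[j])
            db = np.linalg.norm(pts_b[i] - pts_b[j])
            if da > 0 and abs(da - db) / da < tol:
                votes[i] += 1
                votes[j] += 1
    return votes >= min_support * (n - 1)

# Points related by a distance-preserving rotation + translation,
# with two gross mismatches injected at indices 3 and 7.
rng = np.random.default_rng(1)
pts_a = rng.uniform(0, 100, size=(12, 2))
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
pts_b = pts_a @ R.T + np.array([5.0, -3.0])
pts_b[3] += 40.0   # injected outlier
pts_b[7] -= 35.0   # injected outlier
inliers = distance_consistency_inliers(pts_a, pts_b)
print(inliers)
```

Unlike RANSAC, this scheme needs no model fitting per iteration; each correct correspondence is mutually consistent with every other correct one, so inliers accumulate high vote counts while gross mismatches do not.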
