Improving Robustness and Precision in Mobile Robot Localization by Using Laser Range Finding and Monocular Vision


Kai O. Arras, Nicola Tomatis
Autonomous System Lab
Swiss Federal Institute of Technology Lausanne (EPFL)
CH–1015 Lausanne, Switzerland
kai-oliver.arras@epfl.ch, nicola.tomatis@epfl.ch

Abstract

This paper discusses mobile robot localization by means of geometric features from a laser range finder and a CCD camera. The features are line segments from the laser scanner and vertical edges from the camera. Emphasis is put on sensor models with a strong physical basis. For both sensors, uncertainties in the calibration and measurement process are adequately modeled and propagated through the feature extractors. This yields observations with their first order covariance estimates, which are passed to an extended Kalman filter for fusion and position estimation. Experiments on a real platform show that, as opposed to the use of the laser range finder only, the multisensor setup allows the uncertainty to stay bounded in difficult localization situations like long corridors and contributes to an important reduction of uncertainty, particularly in the orientation. The experiments further demonstrate the applicability of such a multisensor localization system in real time on a fully autonomous robot.

1. Introduction

Localization in a known, unmodified environment belongs to the basic skills of a mobile robot. In many potential service applications of mobile systems, the vehicle operates in a structured or semi-structured surrounding. This property can be exploited by using these structures as frequently and reliably recognizable landmarks for navigation. Topological, metric or hybrid navigation schemes make use of different types of environment features on different levels of perceptual abstraction.

Raw data have the advantage of being as general as possible. But, with most sensors, they are credible only by processing great amounts, and they are of low informative value when looking for concise scene descriptions. Navigation based on geometric features allows for compact and precise environment models. Maps of this type are furthermore directly extensible with feature information from different sensors and are thus a good choice for multisensor navigation. This approach relies, however, on the existence of features, which represents a limitation of environment types.

This is viewed as a loss of robustness which can be diminished by simultaneously employing geometric features from different sensors with complementary properties. In this work we consider navigation by means of line segments extracted from 1D range data of a 360° laser scanner and vertical edges extracted from images of an embarked CCD camera.

Precise localization is important in service tasks where load stations might demand accurate docking maneuvers. Mail delivery is such an example [2]. When the task includes operation in crowded environments where a moving vehicle is supposed to suggest reliability and predictability, precise and thus repeatable navigation helps evoke this subjective impression.

The use of the Kalman filter for localization by means of line segments from range data is not new [9][11][2][7]. Vertical edges have been equally employed [8], and propositions for specific matching strategies are available in this context [12]. In [10], the same features were applied to approach the relocation problem. The multisensor setup was used to validate observations of both sensors before accepting them for localization. In [13], a similar setup was used with a 3D laser sensor simultaneously delivering range and intensity images of the scene in front of the robot. Line segments and vertical edges were also employed in a recent work [14], where the localization precision of laser, monocular and trinocular vision has been separately examined and compared to ground truth measurements.
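The fusion scheme described above — features with first order covariance estimates fed to an extended Kalman filter — can be illustrated with a minimal sketch. This is not the authors' implementation: the simple odometry motion model and the single world-line observation model (a line known in world coordinates as a normal angle and a distance, observed in the robot frame) are assumptions introduced here purely for illustration.

```python
import numpy as np

def predict(x, P, d, dtheta, Q):
    """Propagate the pose (x, y, theta) with a simple odometry step:
    forward distance d, heading change dtheta, odometry noise Q."""
    theta = x[2]
    x_new = x + np.array([d * np.cos(theta), d * np.sin(theta), dtheta])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -d * np.sin(theta)],
                  [0.0, 1.0,  d * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def update_line(x, P, z, R, alpha_w, r_w):
    """EKF update with one line feature, known in world coordinates as
    (alpha_w, r_w) and observed in robot coordinates as z = (alpha, r)
    with observation covariance R."""
    # Predicted observation of the world line from the current pose
    z_hat = np.array([alpha_w - x[2],
                      r_w - x[0] * np.cos(alpha_w) - x[1] * np.sin(alpha_w)])
    H = np.array([[0.0, 0.0, -1.0],
                  [-np.cos(alpha_w), -np.sin(alpha_w), 0.0]])
    nu = z - z_hat                   # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ nu
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

A single line only constrains the pose along its normal and in heading; in practice several matched features per step, from both sensors, are fused in the same update loop.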
Proceedings of the Third European Workshop on Advanced Mobile Robots (EUROBOT '99), Zurich, Switzerland, Sept. 6–8, 1999.

2. Sensor Modeling

It is attempted to derive uncertainty models of the employed sensors with a strong physical basis. Strictly speaking, it is necessary to trace each source of uncertainty in the measurement process and, with knowledge of the exact measurement principle, propagate it through the sensor electronics up to the raw measurement the operator will see. This allows for a consequent statistical treatment with noise models of high fidelity, which is of great importance for all subsequent stages like feature extraction and matching.

2.1 Laser Range Finder

In all our experiments we used the commercially available Acuity AccuRange 4000 LR. The Acuity sensor is a compromise between building a laser range finder of one's own and devices like the scanners of SICK (e.g. PLS100, LMS200). The latter two deliver both range and angle information and come with standard interfaces. Besides the protocol driver which is to be written, they can be used practically plug-and-play. The disadvantage is that this black-box character inhibits the above-mentioned analysis of noise sources. The AccuRange 4000 provides range, amplitude and sensor temperature information, where amplitude is the signal strength of the reflected laser beam. They are available as analogue signals.

Relationships for range and angle variance are sought. The accuracy limit of the encoders is usually low with respect to the beam spot size. Angular variability is therefore neglected. For range accuracy there are several factors which influence the extent of noise:

• The amplitude of the returned signal, which is available as a measurement.
• Drift and fluctuations in the sensor circuitry. At the configured sampling frequency (1 kHz) this is predominant over thermal noise of the detection photodiode and resolution artifacts of the internal timers.
• Noise injected by the AD conversion electronics.

In [1] a phase shift measurement principle has been examined, yielding a range variance to amplitude relationship of the form σ_ρ² = a ⁄ V_r + b, where σ_ρ² is the range variance and V_r the measured amplitude. After identification, an inverse, slightly non-linear function was found. For identification in our case, an experiment was performed with a stationary target at about 1 meter distance. The returned signal strength was varied systematically with a Kodak gray scale control patch, where 10'000 readings were taken at each of the twenty gray levels. As opposed to the model in [1], we can observe an abrupt rise of noise below a certain amplitude (fig. 1). This reduces our model for range variance to a constant value, independent of target distance and amplitude.

Figure 1: Range standard deviation (y-axis, in meters) against measured amplitude (x-axis, in Volts). 10'000 readings; measurements were done with a Kodak gray scale control patch.

Although this analysis led to such a simple result, it permits rejection of false or very uncertain readings by means of the amplitude measurements. This is very important since in many practical cases the sensor exhibits a strong dependency upon surface properties like color and roughness. Moreover, the Acuity sensor is often and reproducibly subject to outliers. When the laser beam hits no target at all, and at normally occurring range discontinuities, it returns an arbitrary range value, typically accompanied by a low signal strength.

2.2 CCD Camera

The vision system consists of a Pulnix TM-9701 full-frame, gray-scale, EIA (640 x 480) camera with an effective opening angle of 54° which sends a standard RS-170 signal to a Bt848-based frame grabber. No dedicated DSPs are used in our setup; all image processing is done directly by the CPU of the VME card.

A vision task which is intended to extract accurate geometric information from a scene requires a calibrated vision system. For this, a variety of camera parameters including its position and orientation (extrinsic parameters), image center, scale factor, lens focal length (intrinsic parameters) and distortion parameters are to be determined. To calculate a coherent set of intrinsic, extrinsic and distortion parameters, the calibration technique [15] has been combined with a priori knowledge of features from a test field. The procedure is to extract and fit vertical and horizontal lines from the test field and determine the distortion in the x- and y-direction. By knowing the 3D position of these lines in space, the intrinsic, extrinsic and distortion parameters can be determined simultaneously. Due to the particularity of our application, some simplifications for the final calibration model can be done.

Figure 2: CCD image of the corridor where the experiments have been carried out (step 5 of the trajectory). The image is compensated for radial distortion.

3. Feature Extraction

Geometric environment features can describe structured environments at least partially in a compact and exact way. Horizontal or vertical line segments are of high interest due to the frequent occurrence of line-like structures in man-made environments and the simplicity of their extraction. More specifically, the problem of fitting a model to raw data in the least squares sense has closed form solutions if the model is a line. This is even the case when geometrically meaningful errors are minimized, e.g. the perpendicular distances from the points to the line. Already slightly more complex models like circles do not have this property anymore.

3.1 Laser Range Finder

The extraction method for line segments from 1D range data has been described in [3]. A short outline of the algorithm shall be given. The method delivers lines and segments with their first order covariance estimate using polar coordinates. The line model is

ρ cos(ϕ − α) − r = 0   (4)
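The closed-form least-squares fit mentioned above — minimizing the sum of squared perpendicular distances of points to a line with normal angle α and distance r to the origin — can be sketched as follows. This is the unweighted textbook form, not the actual extractor of [3], which additionally weights points and propagates their covariances.

```python
import math

def fit_line_polar(points):
    """Fit a line x*cos(alpha) + y*sin(alpha) - r = 0 to 2D Cartesian
    points by minimizing the sum of squared perpendicular distances.
    Returns (alpha, r): normal angle and distance to the origin."""
    n = len(points)
    xm = sum(p[0] for p in points) / n
    ym = sum(p[1] for p in points) / n
    # Centered second moments of the point set
    sxx = sum((p[0] - xm) ** 2 for p in points)
    syy = sum((p[1] - ym) ** 2 for p in points)
    sxy = sum((p[0] - xm) * (p[1] - ym) for p in points)
    # Closed-form minimizer of the orthogonal (perpendicular) error
    alpha = 0.5 * math.atan2(-2.0 * sxy, syy - sxx)
    # The line passes through the centroid
    r = xm * math.cos(alpha) + ym * math.sin(alpha)
    # Keep r non-negative by flipping the normal direction if needed
    if r < 0:
        alpha += math.pi
        r = -r
    return alpha, r
```

The two-argument arctangent resolves the quadrant ambiguity of tan(2α), which is why circles and other slightly richer models lose this one-shot property: their normal equations are no longer linear in the parameters.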
