How Sensors Are Literally and Figuratively Driving Mobile Robots

Timothy D. Barfoot, Canada Research Chair (Tier II) in Autonomous Space Robotics, Associate Professor <[email protected]>
ECE Workshop on Sensors for the Networked World, November 21, 2014
(title image source: Google)

Slide 2: Outline
• history of sensors in mobile robotics: early mobile robots, cameras, sonar, 2D laser ranging, 3D laser ranging
• examples of what on-board sensors let us do today: mapping, autonomous route following (Montreal 2014)

Slide 3: Robot and World, coupled by perception (?) and action

Slide 4: Senseless Mobile Robots
• Homer (Iliad, 800 BC), Hero of Alexandria (70 AD), Tesla's radio-controlled boat (1898), Da Vinci (1478)

Slide 5: Light/Touch Sensors (late 1940s to early 1960s)
• Elmer and Elsie (Machina Speculatrix; 1948-9)
• Beast Mod I and Beast Mod II (Johns Hopkins University, Applied Physics Lab; 1961-3)

Slide 6: Cameras (1960s to 1970s)
• Shakey (Stanford Research Institute; 1966-72), shown in its "blocks world"
• autonomous Stanford Cart (1979); Moravec with the Cart

Slide 7: Sonar Rangefinder (1980s)
• H. Moravec and A. E. Elfes, "High Resolution Maps from Wide Angle Sonar," Proceedings of the 1985 IEEE International Conference on Robotics and Automation, March 1985, pp. 116-121
• Hans Moravec, circa early 1980s, with two sonar-equipped robots
• the classic Polaroid sonar sensors could detect range to objects in a 30-degree cone
• CMU evidence grid, circa 1990, built from sonar data

Slide 8: 2D Scanning Laser Rangefinder (1980s)
• DARPA ALV (CMU and SRI)

Slide 9: Cameras (mid 1980s to late 1990s)
• CMU's Navlab family: Navlab 1, Navlab 2, and Navlab 5 (shown in 1997)
• VaMoRs, VaMP, and VITA-2 (Ernst Dickmanns, Bundeswehr University, Munich; 1995)

Slide 10: 2D Scanning Laser Rangefinder (mid 1990s)
• SICK originally intended the laser for safety applications; the Pioneer mobile robot paired with a SICK laser rangefinder became a classic combination
• excerpt from Marshall et al., "Autonomous Underground Tramming for Center-Articulated Vehicles" (Figure 1: 10-t-capacity Atlas Copco ST1010c LHD, with sensor layout)
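A 2D scanning laser like the SICK returns one range per bearing across its field of view; converting a scan into Cartesian points in the sensor frame is the first step of nearly every laser mapping or localization pipeline. A minimal sketch in Python (the beam count, angular resolution, and maximum range are illustrative values, not taken from the talk):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, max_range):
    """Convert a planar laser scan (one range per bearing) to (x, y)
    points in the sensor frame, discarding invalid returns."""
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r < max_range:  # zero/overrange readings mean "no return"
            theta = angle_min + i * angle_increment
            points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Illustrative 181-beam scan spanning 180 degrees at 1-degree resolution,
# as if every beam saw an obstacle 5 m away.
ranges = [5.0] * 181
pts = scan_to_points(ranges, angle_min=-math.pi / 2,
                     angle_increment=math.pi / 180.0, max_range=80.0)
print(len(pts))   # 181
print(pts[90])    # middle beam, bearing ~0 rad: approximately (5.0, 0.0)
```

Each scan of points like these can then be matched against a local metric map, which is exactly the role the laser data plays in the underground localization work excerpted below.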
"[...] obvious motivations to seek an autonomous tramming (a.k.a. autotramming) solution. Yet several factors make infrastructureless autotramming a challenging task. One aspect is the large inertia and characteristic center-articulated/hydraulically actuated steering mechanism, which makes these vehicles difficult to control at high speeds (unlike laboratory robots that often behave approximately as kinematic systems). In this paper, we describe a control architecture that effectively handles these substantive dynamics. Moreover, a most notable challenge is the problem of precise and real-time underground localization. A navigation system that requires the installation and registration of fixed infrastructure would be costly and susceptible to the often harsh environmental conditions found in operating mines. An infrastructureless system is clearly preferable but must easily allow for new routes to be added as the mine advances. Some have argued strongly against the feasibility of approaches requiring precise pose estimation for mining vehicle automation, suggesting instead that 'reactive' methods are preferable (Roberts, Duff, & Corke, 2002, p. 131). However, in this paper we describe a robust localization method that draws on current techniques from mobile robotics. Our technique fuses laser range finder data and data from odometric sensors to determine the position and orientation of the vehicle with respect to a sequence of self-generated local metric maps of the underground environment. Our approach contrasts existing systems that either require infrastructure or often employ topological methods and/or reactive algorithms [...]"
• Rao-Blackwellized particle filter localization for simultaneous localization and mapping (courtesy Cyrill Stachniss)
• mining automation (2006)

Slide 11: Global Positioning System (2000)
• the US military ended selective availability in 2000, suddenly making GPS accurate enough for vehicle positioning (GPS constellation)

Slide 12: Mars Exploration Rovers (2004)
• actual tracks showing autonomous avoidance of rocks on Mars

Slide 13: 1st DARPA Grand Challenge (2004)
• Grand Challenge map; CMU's Sandstorm (which had a setback before the race)

Slide 14: 2nd DARPA Grand Challenge (2005)
• Beer Bottle Pass, on the 2nd Grand Challenge course; Stanford's Stanley

Slide 15: DARPA Urban Challenge (2007)
• MIT's TALOS, CMU's Boss, Stanford's Junior

Slide 16: 3D Scanning Laser Rangefinder (2007)

Slide 17: Look Ma, No Lasers! (source: Mercedes-Benz)

Slide 18: Q: How did mobile robots get to here? A: It has a lot to do with sensors.
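The sonar and laser mapping milestones above converge on one representation: Moravec and Elfes's evidence grid (today usually called an occupancy grid), in which each cell accumulates evidence of being occupied or free as measurements arrive. A minimal single-cell log-odds sketch in Python; the update weights are illustrative, and a full grid would spread each sonar reading over its 30-degree cone rather than touch one cell:

```python
import math

# Illustrative log-odds increments: cells observed along a beam before the
# return become more likely free; the cell at the return, more likely occupied.
L_FREE = -0.4
L_OCC = 0.85

def update_cell(logodds, hit):
    """Accumulate evidence for one cell via a Bayesian log-odds update."""
    return logodds + (L_OCC if hit else L_FREE)

def occupancy_probability(logodds):
    """Map accumulated log-odds back to a probability of occupancy."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

# A cell that returns an echo on three consecutive measurements:
l = 0.0
for _ in range(3):
    l = update_cell(l, hit=True)
print(round(occupancy_probability(l), 3))  # 0.928
```

The log-odds form makes repeated evidence additive, which is why grids built from thousands of noisy sonar pings (like the CMU evidence grid of slide 7) converge to crisp maps.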
Slide 19: Common Sensors
• cameras: stereo camera, RGB-D camera, monocular camera, omnidirectional camera
• ranging: 2D laser rangefinder, 3D laser rangefinder, flash lidar, infrared rangefinder, sonar, radar
• other: global positioning system, inertial measurement unit, compass, inclinometer, encoder, sun sensor, switch (also star tracker)

Slide 20: State-of-the-Art Mapping
• Zlot and Bosse, CSIRO, 2011

Slide 21: Autonomous Space Robotics Lab (ASRL)
• applications: safety and security, mining, planetary rovers

Slide 22: Stereo Visual Odometry Pipeline (Devon Island 2008)
• left and right images → de-warp and rectification → keypoint detection → stereo keypoint matching → keypoint tracking against the previous frame → outlier rejection → nonlinear numerical solution → pose estimate
• Moravec (1980), Matthies (1987), and extended by many others

Slide 23: Teach Phase (manual teach)
• vehicle commands drive the wheeled chassis (the plant); a stereo camera (the sensor) feeds images to a relative mapper, which combines the motion estimate with vehicle state to build a relative map (the reference) (Mistastin 2011)

Slide 24: Repeat Phase (autonomous repeat)
• a path tracker with safety checks (the controller) issues vehicle commands; a path localizer (the state estimator) uses the stereo images to compute the pose relative to the taught path, closing the loop against the relative map (the reference)
• Furgale, P. T. and Barfoot, T. D., "Visual Path Following on a Manifold in Unstructured Three-Dimensional Terrain," ICRA 2010 (Kuka Service Robotics Best Paper Award)
• Furgale, P. T. and Barfoot, T. D., "Visual Teach and Repeat for Long-Range Rover Autonomy," JFR 2010

Slide 25: Visual Route Following (UofT, Aerospace): Devon Island 2009, Montreal 2012

Slide 26: Visual Route Following (UofT, Aerospace): UTIAS 2014
• Tim Barfoot, [email protected], http://asrl.utias.utoronto.ca

Slide 27: First Call for Papers, 10th Conference on Field and Service Robotics (FSR)
June 24-26, 2015, Toronto, Canada (fsr.utias.utoronto.ca)

Important Dates
• Paper Submission: Feb 13, 2015
• Acceptance Notification: Mar 20, 2015
• Revised Papers Due: Apr 17, 2015

Organizers
• General Chair: Tim Barfoot (University of Toronto)
• Program Chair: David Wettergreen (CMU)
• Local Arrangements: Jon Kelly (University of Toronto)

We invite papers describing original work in a number of technical areas related to field and service robotics, including (but not limited to): mining, agriculture, and forestry robotics; intelligent cars; construction robots; cleaning, floor, and lawn care robotics; search and rescue robotics; security robotics; autonomous vehicles for land, air, and sea; medical and healthcare robotics; planetary exploration robots; entertainment robotics; environmental robotics; robots for cultural heritage.

Book and Journal Special Issue: a book version of the conference proceedings will be published by Springer-Verlag in their STAR series, and selected field robotics papers will be considered for publication in a Special Issue of the Journal of Field Robotics.

Sponsors: fsr.utias.utoronto.ca
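Returning to the stereo visual odometry pipeline of slide 22: the step that turns a stereo camera into a range sensor is triangulating matched keypoints across a rectified image pair. A minimal sketch in Python, with hypothetical calibration numbers (the focal length, principal point, and baseline are illustrative, not ASRL's):

```python
def triangulate(u_left, u_right, v, fx, fy, cx, cy, baseline):
    """Recover a 3D point (left-camera frame) from a keypoint matched
    across a rectified stereo pair. With disparity d = u_left - u_right,
    similar triangles give depth z = fx * baseline / d."""
    d = u_left - u_right
    if d <= 0.0:
        raise ValueError("non-positive disparity: bad match or point at infinity")
    z = fx * baseline / d
    x = (u_left - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# Hypothetical calibration: 500 px focal length, 640x480 image, 0.24 m baseline.
p = triangulate(u_left=350.0, u_right=330.0, v=240.0,
                fx=500.0, fy=500.0, cx=320.0, cy=240.0, baseline=0.24)
print(p)  # 20 px of disparity places the point about 6 m ahead
```

In the pipeline, clouds of such points from consecutive frames are aligned (after outlier rejection) by a nonlinear least-squares solve to produce the pose estimate.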
