How Sensors Are Literally and Figuratively Driving Mobile Robots

Total Pages: 16

File Type: pdf, Size: 1020 KB

How Sensors Are Literally and Figuratively Driving Mobile Robots
Timothy D Barfoot, Canada Research Chair (Tier II) in Autonomous Space Robotics, Associate Professor <[email protected]>
ECE Workshop on Sensors for the Networked World, November 21, 2014
(title image source: Google)

Outline
• history of sensors in mobile robotics: early mobile robots, cameras, sonar, 2D laser ranging, 3D laser ranging
• examples of what on-board sensors let us do today: mapping, autonomous route following
(photo: Montreal 2014)

Robot, Perception, Action, World: the sense-act loop between a robot and the world

Senseless Mobile Robots
• Homer (Iliad, 800 BC), Hero of Alexandria (70 AD), Da Vinci (1478), Tesla's radio-controlled boat (1898)

Light/Touch Sensors (late 1940s to early 1960s)
• Elmer and Elsie (Machina Speculatrix; 1948-9)
• Beast Mod I and Beast Mod II (Johns Hopkins University, Applied Physics Lab; 1961-3)

Cameras (1960s to 1970s)
• Shakey (Stanford Research Institute; 1966-72); Shakey in 'blocks world'
• Stanford Cart (1979); Moravec with the Cart

Sonar Rangefinder (1980s)
• Hans Moravec circa 1980 with two sonar-equipped robots
• the classic Polaroid sonar sensors could detect range to objects in a 30-degree cone
• CMU evidence grid circa 1990 built from sonar data
• H. Moravec and A. E. Elfes, "High Resolution Maps from Wide Angle Sonar," Proceedings of the 1985 IEEE International Conference on Robotics and Automation, March 1985, pp. 116-121

2D Scanning Laser Rangefinder (1980s)
• DARPA ALV (CMU and SRI)

Cameras (mid 1980s to late 1990s)
• CMU's Navlab: Navlab 1, Navlab 2, Navlab 5; Navlab in 1997
• VaMoRs, VaMP, Vita-2 (Bundeswehr University, Munich; 1995); Ernst Dickmanns

2D Scanning Laser Rangefinder (mid 1990s)
• SICK originally intended its laser for safety applications; the Pioneer mobile robot paired with a SICK laser rangefinder became a classic combination
• Rao-Blackwellized particle filter for simultaneous localization and mapping (courtesy Cyrill Stachniss, 2006)
• mining automation: Marshall et al., "Autonomous Underground Tramming for Center-Articulated Vehicles" (Figure 1: 10-t-capacity Atlas Copco ST1010 LHD, with sensor layout). The excerpt shown describes a localization method that fuses laser rangefinder and odometric data to determine the position and orientation of the vehicle with respect to a sequence of self-generated local metric maps of the underground environment. Fixed infrastructure would be costly and susceptible to the often harsh environmental conditions found in operating mines, so an infrastructureless system is preferable, although it must easily allow new routes to be added as the mine advances; this contrasts with existing systems that either require infrastructure or employ topological methods and/or reactive algorithms, and with arguments that precise pose estimation is infeasible for mining vehicle automation (Roberts, Duff, & Corke, 2002, p. 131). The vehicles' large inertia and center-articulated, hydraulically actuated steering also make them difficult to control at high speeds (unlike laboratory robots that often behave approximately as kinematic systems), and the paper describes a control architecture that handles these substantive dynamics.

Global Positioning System (2000)
• the US military ended Selective Availability in 2000, suddenly making GPS accurate enough for vehicle positioning
• GPS constellation

Mars Exploration Rovers (2004)
• actual tracks showing autonomous avoidance of rocks on Mars

1st DARPA Grand Challenge (2004)
• Grand Challenge map; CMU's Sandstorm; Sandstorm has a setback before the race

2nd DARPA Grand Challenge (2005)
• Beer Bottle Pass, on the 2nd Grand Challenge; Stanford's Stanley

DARPA Urban Challenge (2007)
• MIT's Talos; CMU's Boss; Stanford's Junior

3D Scanning Laser Rangefinder (2007)

Look Ma, No Lasers!
• (source: Mercedes-Benz)

Q: How did mobile robots get to here? A: It has a lot to do with sensors.

Common Sensors
• stereo camera, RGB-D camera, 3D laser rangefinder, flash lidar, global positioning system, inertial measurement unit, omnidirectional camera, monocular camera, 2D laser rangefinder, compass, inclinometer, encoder, infrared rangefinder, sonar, radar, switch, sun sensor (also star tracker)

State-of-the-Art Mapping
• Zlot and Bosse, CSIRO, 2011

Autonomous Space Robotics Lab (ASRL)
• planetary rovers, mining, safety and security

Stereo Visual Odometry Pipeline (Devon Island 2008)
• left and right images are de-warped and rectified, keypoints are detected in each image, stereo matching and keypoint tracking against the previous frame follow, outliers are rejected, and a nonlinear numerical solution produces the pose estimate (a minimal code sketch follows after this outline)
• Moravec (1980), Matthies (1987), and extended by many others

Teach Phase (manual teach; Mistastin 2011)
• the operator issues vehicle commands to the wheeled chassis (the plant); the stereo camera (sensor) provides images; a motion estimate and a relative mapper build a relative map (the reference)

Repeat Phase (autonomous repeat)
• a path tracker with safety checks (the controller) issues vehicle commands to the wheeled chassis (the plant); the stereo camera (sensor) provides images; a path localizer (state estimator) computes the pose relative to the path using the relative map (the reference)
• Furgale P T and Barfoot T D, "Visual Path Following on a Manifold in Unstructured Three-Dimensional Terrain", ICRA 2010 (Kuka Service Robotics Best Paper Award)
• Furgale P T and Barfoot T D, "Visual Teach and Repeat for Long-Range Rover Autonomy", JFR 2010

Visual Route Following (UofT, Aerospace)
• Devon Island 2009; Montreal 2012; UTIAS 2014

Tim Barfoot, [email protected], http://asrl.utias.utoronto.ca
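To make the stereo visual odometry pipeline outlined above concrete, the following is a minimal Python sketch of a single frame-to-frame step: detect keypoints, match them across the stereo pair, triangulate, track into the next frame, reject outliers, and estimate the pose change. It is an illustration only, not the ASRL implementation; it assumes rectified images with a known focal length, principal point and baseline, and it substitutes OpenCV's ORB features and RANSAC PnP for the pipeline's own matching and nonlinear solver.

```python
# Minimal stereo visual odometry step (illustrative sketch; not the ASRL pipeline).
# Assumes rectified grayscale stereo pairs, focal length f, principal point (cx, cy),
# and stereo baseline b in metres. Requires: numpy, opencv-python.
import cv2
import numpy as np


def triangulate(pts_left, pts_right, f, cx, cy, b):
    """Triangulate matched keypoints from a rectified stereo pair into 3D points."""
    points = []
    for (ul, vl), (ur, _) in zip(pts_left, pts_right):
        d = ul - ur  # disparity in pixels
        if d <= 1e-3:
            points.append(None)  # too far away or a bad match
            continue
        z = f * b / d
        points.append(np.array([(ul - cx) * z / f, (vl - cy) * z / f, z]))
    return points


def stereo_vo_step(prev, curr, f, cx, cy, b):
    """Estimate camera motion between two stereo frames.

    prev, curr: dicts with 'left' and 'right' grayscale images.
    Returns (R, t) mapping points from the previous to the current camera frame,
    or None if there are too few matches.
    """
    orb = cv2.ORB_create(2000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    # 1. Keypoint detection in the previous left and right images.
    kp_l, des_l = orb.detectAndCompute(prev["left"], None)
    kp_r, des_r = orb.detectAndCompute(prev["right"], None)
    if des_l is None or des_r is None:
        return None

    # 2. Stereo matching, then triangulation into 3D landmarks.
    stereo = matcher.match(des_l, des_r)
    if not stereo:
        return None
    landmarks = triangulate([kp_l[m.queryIdx].pt for m in stereo],
                            [kp_r[m.trainIdx].pt for m in stereo], f, cx, cy, b)

    # 3. Keypoint tracking: match the stereo keypoints into the current left image.
    kp_c, des_c = orb.detectAndCompute(curr["left"], None)
    if des_c is None:
        return None
    temporal = matcher.match(np.array([des_l[m.queryIdx] for m in stereo]), des_c)

    obj, img = [], []
    for m in temporal:
        if landmarks[m.queryIdx] is not None:
            obj.append(landmarks[m.queryIdx])
            img.append(kp_c[m.trainIdx].pt)
    if len(obj) < 6:
        return None

    # 4. Outlier rejection and pose estimation with RANSAC PnP.
    K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1]], dtype=np.float64)
    ok, rvec, tvec, _ = cv2.solvePnPRansac(np.asarray(obj, dtype=np.float64),
                                           np.asarray(img, dtype=np.float64), K, None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec.reshape(3)
```

A full system of the kind used for visual teach and repeat adds keyframing, sub-pixel stereo matching and a proper nonlinear batch solution, but the detect, match, triangulate and estimate structure is the same.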
10th Conference on Field and Service Robotics (FSR), June 24-26, 2015, Toronto, Canada: First Call for Papers (fsr.utias.utoronto.ca)

Important Dates
• Paper Submission: Feb 13, 2015
• Acceptance Notification: Mar 20, 2015
• Revised Papers Due: Apr 17, 2015

Organizers
• General Chair: Tim Barfoot (University of Toronto)
• Program Chair: David Wettergreen (CMU)
• Local Arrangements: Jon Kelly (University of Toronto)

We invite papers describing original work in a number of technical areas related to field and service robotics, including (but not limited to): Mining, Agriculture and Forestry Robotics; Intelligent Cars; Construction Robots; Cleaning, Floor and Lawn Care Robotics; Search and Rescue Robotics; Security Robotics; Autonomous Vehicles for Land, Air and Sea; Medical and Healthcare Robotics; Planetary Exploration Robots; Entertainment Robotics; Environmental Robotics; Robots for Cultural Heritage.

Book and Journal Special Issue: a book version of the conference proceedings will be published by Springer-Verlag in their STAR series, and selected field robotics papers will be considered for publication in a Special Issue of the Journal of Field Robotics.
Recommended publications
  • Build a Self-Driving Network
    SELF-DRIVING NETWORKS: "LOOK, MA: NO HANDS". Kireeti Kompella, CTO, Engineering, Juniper Networks. ITU FG Net2030, Geneva, Oct 2019.
    THE DARPA GRAND CHALLENGE: build a fully autonomous ground vehicle. Goal: drive a pre-defined 240 km course in the Mojave Desert along freeway I-15. Prize: $1 million. Result: 2004, fail (the best vehicle managed less than 12 km); 2005, 5/23 completed it; 2007, the "Urban Challenge": drive a 96 km urban course following traffic regulations and dealing with other cars; 6 cars completed this. Impact: programmers, not drivers; no cops, lawyers, witnesses; quadruple highway capacity; glitches, insurance? ethical self-driving cars?
    GRAND RESULT: THE SELF-DRIVING CAR (2009, 2014). No steering wheel, no pedals: a completely autonomous car. Not just an incremental improvement; this is a DISRUPTIVE change in automotive technology.
    THE NETWORKING GRAND CHALLENGE: build a self-driving network. Goal: Self-Discover, Self-Configure, Self-Monitor, Self-Correct; Auto-Detect Customers, Auto-Provision, Self-Diagnose, Self-Optimize, Self-Report. Impact: new skill sets required; new focus; BGP/IGP policies become AI policy; service config becomes service design; reactive becomes proactive; firewall rules become anomaly detection. Result: free up people to work at a higher level: new service design and "mash-ups". Possibilities: agile, even anticipatory service creation; fast, intelligent response to security breaches.
  • An Off-Road Autonomous Vehicle for DARPA's Grand Challenge
    2005 Journal of Field Robotics Special Issue on the DARPA Grand Challenge. MITRE Meteor: An Off-Road Autonomous Vehicle for DARPA's Grand Challenge. Robert Grabowski, Richard Weatherly, Robert Bolling, David Seidel, Michael Shadid, and Ann Jones. The MITRE Corporation, 7525 Colshire Drive, McLean VA, 22102. [email protected]
    Abstract: The MITRE Meteor team fielded an autonomous vehicle that competed in DARPA's 2005 Grand Challenge race. This paper describes the team's approach to building its robotic vehicle, the vehicle and components that let the vehicle see and act, and the computer software that made the vehicle autonomous. It presents how the team prepared for the race and how their vehicle performed.
    1 Introduction: In 2004, the Defense Advanced Research Projects Agency (DARPA) challenged developers of autonomous ground vehicles to build machines that could complete a 132-mile, off-road course. (Figure 1: The MITRE Meteor starting the finals of the 2005 DARPA Grand Challenge.) 195 teams applied; only 23 qualified to compete. Qualification included demonstrations to DARPA and a ten-day National Qualifying Event (NQE) in California. The race took place on October 8 and 9, 2005 in the Mojave Desert over a course containing gravel roads, dirt paths, switchbacks, open desert, dry lakebeds, mountain passes, and tunnels. The MITRE Corporation decided to compete in the Grand Challenge in September 2004 by sponsoring the Meteor team. They believed that MITRE's work programs and military sponsors would benefit from an understanding of the technologies that contribute to the DARPA Grand Challenge.
  • A Survey of Autonomous Driving: Common Practices and Emerging Technologies
    Accepted March 22, 2020. Digital Object Identifier 10.1109/ACCESS.2020.2983149. A Survey of Autonomous Driving: Common Practices and Emerging Technologies. Ekim Yurtsever1 (Member, IEEE), Jacob Lambert1, Alexander Carballo1 (Member, IEEE), and Kazuya Takeda1,2 (Senior Member, IEEE). 1Nagoya University, Furo-cho, Nagoya, 464-8603, Japan. 2Tier4 Inc., Nagoya, Japan. Corresponding author: Ekim Yurtsever (e-mail: [email protected]).
    ABSTRACT: Automated driving systems (ADSs) promise a safe, comfortable and efficient driving experience. However, fatalities involving vehicles equipped with ADSs are on the rise. The full potential of ADSs cannot be realized unless the robustness of the state of the art is improved further. This paper discusses unsolved problems and surveys the technical aspect of automated driving. Studies regarding present challenges, high-level system architectures, emerging methodologies and core functions including localization, mapping, perception, planning, and human machine interfaces were thoroughly reviewed. Furthermore, many state-of-the-art algorithms were implemented and compared on our own platform in a real-world driving setting. The paper concludes with an overview of available datasets and tools for ADS development.
    INDEX TERMS: Autonomous Vehicles, Control, Robotics, Automation, Intelligent Vehicles, Intelligent Transportation Systems
    I. INTRODUCTION: According to a recent technical report by the National Highway Traffic Safety Administration (NHTSA), 94% of road accidents are caused by human errors [1]. Against this backdrop, Automated Driving Systems (ADSs) are being developed with the promise of [...] necessary here. The Eureka Project PROMETHEUS [11] was carried out in Europe between 1987-1995, and it was one of the earliest major automated driving studies. The project led to the development of VITA II by Daimler-Benz, which succeeded in automatically driving on highways [12].
  • AN ADVANCED VISION SYSTEM FOR GROUND VEHICLES
    AN ADVANCED VISION SYSTEM FOR GROUND VEHICLES. Ernst Dieter Dickmanns, UniBw Munich, LRT, Institut fuer Systemdynamik und Flugmechanik, D-85577 Neubiberg, Germany.
    ABSTRACT: 'Expectation-based, Multi-focal, Saccadic' (EMS) vision has been developed over the last six years based on the 4-D approach to dynamic vision. It is conceived around a 'Multi-focal, active/reactive Vehicle Eye' (MarVEye) with active gaze control for a set of three to four conventional TV-cameras mounted fixed relative to each other on a pointing platform. This arrangement allows both a wide simultaneous field of view (> ~100°) with a central region of overlap for stereo interpretation and high resolution in a central 'foveal' field of view from one or two tele-cameras. Perceptual and behavioral capabilities are now explicitly represented in the system for improved flexibility and growth potential.
    [...] where is it relative to me and how does it move? (VT2) 3. What is the likely future motion and (for subjects) intention of the object/subject tracked? (VT3). These vision tasks have to be solved by different methods and on different time scales in order to be efficient. Also the fields of view required for answering the first two questions are quite different. Question 3 may be answered more efficiently by building on the results of many specific processes answering question 2 than by resorting to image data directly. Vision systems for a wider spectrum of tasks in ground vehicle guidance have been addressed by a few groups only. The Robotics Institute of Carnegie Mellon University (CMU) [1-6] and DaimlerChrysler Research in Stuttgart/Ulm [7-12] (together with several university [...]
  • A Hierarchical Control System for Autonomous Driving Towards Urban Challenges
    Applied Sciences, Article: A Hierarchical Control System for Autonomous Driving towards Urban Challenges. Nam Dinh Van, Muhammad Sualeh, Dohyeong Kim and Gon-Woo Kim*,†. Intelligent Robotics Laboratory, Department of Control and Robot Engineering, Chungbuk National University, Cheongju-si 28644, Korea; [email protected] (N.D.V.); [email protected] (M.S.); [email protected] (D.K.). * Correspondence: [email protected]. † Current Address: Chungdae-ro 1, Seowon-Gu, Cheongju, Chungbuk 28644, Korea. Received: 23 April 2020; Accepted: 18 May 2020; Published: 20 May 2020.
    Abstract: In recent years, self-driving car technologies have been developed with many success stories in both academia and industry. The challenge for autonomous vehicles is the requirement of operating accurately and robustly in the urban environment. This paper focuses on how to efficiently put the hierarchical control system of a self-driving car into practice. The technique is composed of decision making, local path planning and control. An ego vehicle is navigated by global path planning with the aid of a High Definition map. Firstly, we propose the decision making for motion planning by applying a two-stage Finite State Machine to manipulate mission planning and control states. Furthermore, we implement a real-time hybrid A* algorithm with an occupancy grid map to find an efficient route for obstacle avoidance. Secondly, the local path planning is conducted to generate a safe and comfortable trajectory in unstructured scenarios. Herein, we solve an optimization problem with nonlinear constraints to optimize the sum of jerks for a smooth drive. In addition, controllers are designed by using the pure pursuit algorithm and the scheduled feedforward PI controller for the lateral and longitudinal directions, respectively.
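The abstract above names the pure pursuit algorithm for lateral control. As a rough, self-contained illustration of that idea (not code from the paper; the lookahead distance, wheelbase and waypoint format are assumed values for the sketch), a minimal pure pursuit steering computation in Python could look like this:

```python
# Minimal pure pursuit lateral controller sketch (illustrative; not from the cited paper).
import math

def pure_pursuit_steering(pose, path, lookahead=5.0, wheelbase=2.7):
    """Return a steering angle (rad) that drives the vehicle toward a lookahead point.

    pose: (x, y, yaw) of the rear axle in world coordinates.
    path: list of (x, y) waypoints.
    lookahead: lookahead distance in metres (assumed value).
    wheelbase: distance between axles in metres (assumed value).
    """
    x, y, yaw = pose

    # Pick the first waypoint at least `lookahead` metres away from the vehicle.
    target = None
    for wx, wy in path:
        if math.hypot(wx - x, wy - y) >= lookahead:
            target = (wx, wy)
            break
    if target is None:
        target = path[-1]  # near the end of the path: aim at the last waypoint

    # Angle of the target point relative to the vehicle heading.
    alpha = math.atan2(target[1] - y, target[0] - x) - yaw

    # Pure pursuit law: curvature = 2*sin(alpha)/Ld, steering = atan(L * curvature).
    ld = math.hypot(target[0] - x, target[1] - y)
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)

# Example: a vehicle at the origin heading along +x, following a gentle left curve.
if __name__ == "__main__":
    waypoints = [(i * 1.0, 0.02 * i * i) for i in range(50)]
    print(pure_pursuit_steering((0.0, 0.0, 0.0), waypoints))
```

The controller simply steers toward a point a fixed distance ahead on the reference path; the cited work pairs this lateral law with a scheduled feedforward PI controller for the longitudinal direction.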
  • An Introduction of Autonomous Vehicles and a Brief Survey
    Journal of Critical Reviews, ISSN 2394-5125, Vol 7, Issue 13, 2020. AN INTRODUCTION OF AUTONOMOUS VEHICLES AND A BRIEF SURVEY. 1Tirumalapudi Raviteja, 2Rajay Vedaraj I.S. 1Research Scholar, School of Mechanical Engineering, Vellore Institute of Technology, Tamilnadu, India, Email: [email protected]. 2Professor, School of Mechanical Engineering, Vellore Institute of Technology, Tamilnadu, India, Email: [email protected]. Received: 09.04.2020; Revised: 10.05.2020; Accepted: 06.06.2020.
    Abstract: An autonomous car is also called a self-driving car, driverless car or robotic car; whatever the name, the aim of the technology is the same. Experiments with autonomous vehicle technology started in 1920, controlled by radio technology, and trials began in 1950. Over the past few years automation technology has been updated day by day and is used in all aspects of regular human life; people now rely on automation and robotics in agriculture, medicine, transportation, the automobile and manufacturing industries, the IT sector, etc. For the last ten years the automobile industry has come forward to research autonomous vehicle technology (Waymo/Google, Uber, Tesla, Renault, Toyota, Audi, Volvo, Mercedes-Benz, General Motors, Nissan, Bosch, Continental, etc.). Level-3 autonomous cars came out in 2020, and researchers are solving autonomous vehicle challenges every day. In the future, robots will manufacture autonomous cars without human help, using IoT technology and based on customer requirements, and these vehicles are expected to be very safe and comfortable for transporting people or cargo. Autonomous vehicles need continuous data and updates, and IoT and Artificial Intelligence help to share information from device to device.
  • Self-Driving Car Using Artificial Intelligence
    International Journal of Future Generation Communication and Networking, Vol. 13, No. 3s, (2020), pp. 612-615. Self-Driving Car Using Artificial Intelligence. Prof. S.R. Wategaonkar#1, Sujoy Bhattacharya*2, Shubham Bhitre*3, Prem Kumar Singh*4, Khushboo Bedse*5. #Professor and Project Faculty Member, Department of Electronic and Telecommunication, Bharati Vidyapeeth's College of Engineering, Navi Mumbai, Maharashtra, India. *Project Student, Department of Electronic and Telecommunication, Bharati Vidyapeeth's College of Engineering, Navi Mumbai, Maharashtra, India. [email protected] [email protected] [email protected] [email protected] [email protected]
    Abstract: Self-driving cars have the potential to transform urban mobility by providing safe, convenient and congestion-free transport. As an application of Artificial Intelligence (AI), an autonomous vehicle faces several difficulties, such as traffic light detection, lane end detection, pedestrians, signs, etc. These problems can be overcome by using technologies such as Machine Learning (ML), Deep Learning (DL) and Image Processing. In this paper, the authors propose a deep neural network for lane and traffic light detection. The model is trained and assessed using a dataset that contains front-view image frames and the steering angle data captured while driving on the road. Keeping the car inside the lane is a very important feature of self-driving cars, and the model learns how to keep the vehicle in lane from human driving data. The paper presents Artificial Intelligence based autonomous navigation and obstacle avoidance of self-driving cars, applied with Deep Learning to simulated cars in an urban environment. Keywords: Self-Driving Car, Autonomous Driving, Object Detection, Lane End Detection, Deep Learning, Traffic Light Detection.
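The abstract above describes a network that learns lane keeping from front-view image frames paired with recorded steering angles. As a hedged sketch of that kind of end-to-end regressor (an illustration in PyTorch with an assumed 66x200 input size and layer layout, not the authors' architecture), the core of such a model might look like this:

```python
# Minimal end-to-end steering-angle regressor sketch (illustrative; not the paper's model).
# Assumes 66x200 RGB front-view frames and one scalar steering-angle label per frame.
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
            nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(64 * 3 * 20, 100), nn.ReLU(),   # 64x3x20 feature map for 66x200 input
            nn.Linear(100, 10), nn.ReLU(),
            nn.Linear(10, 1),                         # predicted steering angle
        )

    def forward(self, x):
        return self.head(self.features(x))

if __name__ == "__main__":
    model = SteeringNet()
    frames = torch.randn(8, 3, 66, 200)   # a dummy batch of camera frames
    angles = torch.randn(8, 1)            # matching steering-angle labels
    loss = nn.functional.mse_loss(model(frames), angles)
    loss.backward()                       # one gradient step of behavioural cloning
    print(loss.item())
```

Training then amounts to behavioural cloning: minimizing the mean squared error between predicted and recorded steering angles over the human driving dataset.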
  • Stanley: the Robot That Won the DARPA Grand Challenge
    Stanley: The Robot that Won the DARPA Grand Challenge. Sebastian Thrun, Mike Montemerlo, Hendrik Dahlkamp, David Stavens, Andrei Aron, James Diebel, Philip Fong, John Gale, Morgan Halpenny, Gabriel Hoffmann, Kenny Lau, Celia Oakley, Mark Palatucci, Vaughan Pratt, and Pascal Stang (Stanford Artificial Intelligence Laboratory, Stanford University, Stanford, California 94305); Sven Strohband, Cedric Dupont, Lars-Erik Jendrossek, Christian Koelen, Charles Markey, Carlo Rummel, Joe van Niekerk, Eric Jensen, and Philippe Alessandrini (Volkswagen of America, Inc., Electronics Research Laboratory, 4009 Miranda Avenue, Suite 100, Palo Alto, California 94304); Gary Bradski, Bob Davies, Scott Ettinger, Adrian Kaehler, and Ara Nefian (Intel Research, 2200 Mission College Boulevard, Santa Clara, California 95052); Pamela Mahoney (Mohr Davidow Ventures, 3000 Sand Hill Road, Bldg. 3, Suite 290, Menlo Park, California 94025). Received 13 April 2006; accepted 27 June 2006. Journal of Field Robotics 23(9), 661-692 (2006). © 2006 Wiley Periodicals, Inc. Published online in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/rob.20147
    This article describes the robot Stanley, which won the 2005 DARPA Grand Challenge. Stanley was developed for high-speed desert driving without manual intervention. The robot's software system relied predominantly on state-of-the-art artificial intelligence technologies, such as machine learning and probabilistic reasoning. This paper describes the major components of this architecture, and discusses the results of the Grand Challenge race.
    1. INTRODUCTION: The Grand Challenge was launched by the Defense Advanced Research Projects Agency (DARPA) in [...] result of an intense development effort led by Stanford University, and involving experts from Volkswagen of America, Mohr Davidow Ventures, Intel Research, and a number of other entities.
  • Introducing Driverless Cars to UK Roads
    Introducing Driverless Cars to UK Roads. WORK PACKAGE 5.1, Deliverable D1. Understanding the Socioeconomic Adoption Scenarios for Autonomous Vehicles: A Literature Review. Ben Clark, Graham Parkhurst, Miriam Ricci. June 2016.
    Preferred Citation: Clark, B., Parkhurst, G. and Ricci, M. (2016) Understanding the Socioeconomic Adoption Scenarios for Autonomous Vehicles: A Literature Review. Project Report. University of the West of England, Bristol. Available from: http://eprints.uwe.ac.uk/29134
    Centre for Transport & Society, Department of Geography and Environmental Management, University of the West of England, Bristol BS16 1QY, UK. Email enquiries to [email protected]
    VENTURER: Introducing driverless cars to UK roads. Contents: 1 Introduction; 2 A History of Autonomous Vehicles; 3 Theoretical Perspectives on the Adoption of AVs (3.1 The Multi-Level Perspective and Socio-Technical Transitions; 3.2 The Technology Acceptance Model; 3.3 Summary)
  • Safety of Autonomous Vehicles
    Hindawi Journal of Advanced Transportation, Volume 2020, Article ID 8867757, 13 pages. https://doi.org/10.1155/2020/8867757
    Review Article: Safety of Autonomous Vehicles. Jun Wang,1 Li Zhang,1 Yanjun Huang,2 and Jian Zhao2. 1Department of Civil and Environmental Engineering, Mississippi State University, Starkville, MS 39762, USA. 2Department of Mechanical and Mechatronics Engineering, University of Waterloo, 200 University Avenue West, Waterloo, ON N2L 3G1, Canada. Correspondence should be addressed to Jian Zhao; [email protected]. Received 1 June 2020; Revised 3 August 2020; Accepted 1 September 2020; Published 6 October 2020. Academic Editor: Francesco Bella. Copyright © 2020 Jun Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
    The autonomous vehicle (AV) is regarded as the ultimate solution to future automotive engineering; however, safety still remains the key challenge for the development and commercialization of AVs. Therefore, a comprehensive understanding of the development status of AVs and reported accidents is becoming urgent. In this article, the levels of automation are reviewed according to the role of the automated system in the autonomous driving process, which will affect the frequency of the disengagements and accidents when driving in autonomous modes. Additionally, the public on-road AV accident reports are statistically analyzed. The results show that over 3.7 million miles have been tested for AVs by various manufacturers from 2014 to 2018. The AVs are frequently taken over by drivers if they deem it necessary, and the disengagement frequency varies significantly, from 2 × 10^-4 to 3 disengagements per mile, for different manufacturers.
  • Aptiv Application to Test Automated Driving Systems (ADS)
    Charles D. Baker, Governor; Karyn E. Polito, Lieutenant Governor; Stephanie Pollack, MassDOT Secretary & CEO. massDOT, Massachusetts Department of Transportation.
    Application to Test Automated Driving Systems on Public Ways in Massachusetts
    CONTACT INFORMATION: Name of Organization: Aptiv Services US LLC and its affiliate nuTonomy, Inc. Primary Contact Person: Abe Ghabra. Title: Managing Director, Las Vegas Technical Center. Street Address of Company's Headquarters Office: 100 Northern Ave., Suite 200. City, Town of Headquarters Office: Boston. State: MA. Zip Code: 02210. Country: USA. Website: https://www.aptiv.com/autonomous-mobility
    CERTIFICATION: The Applicant certifies that all information contained within this application is true, accurate and complete to the best of its knowledge. Signature of Applicant's Representative: Abe Ghabra, 01 January 2020. Position and Title: Managing Director, Las Vegas Technical Center.
    Ten Park Plaza, Suite 4160, Boston, MA 02116. Tel. 857-368-4636, TTY: 857-368-0655. www.mass.gov/massdot
    Application to Test Automated Driving Systems on Public Ways in Massachusetts: Detailed Information. Detail #1: Experience with Automated Driving Systems (ADS). Detail #2: Operational Design Domain. Detail #3: Summary of Training and Operations Protocol. Detail #4: First Responders Interaction Plan. Detail #5: Applicant's Voluntary Safety Self-Assessment. Detail #6: Motor Vehicles in Testing Program. Detail #7: Drivers in Testing Program. Detail #8: Insurance Requirements. Detail #9: Additional Questions.
    Note: Applicants should not disclose any confidential information or other material considered to be trade secrets, as the applications are considered to be public records. The Massachusetts Public Records Law applies to records created by or in the custody of a state or local agency, board or other government entity.
  • Towards a Viable Autonomous Driving Research Platform
    Towards a Viable Autonomous Driving Research Platform. Junqing Wei, Jarrod M. Snider, Junsung Kim, John M. Dolan, Raj Rajkumar and Bakhtiar Litkouhi.
    Abstract: We present an autonomous driving research vehicle with minimal appearance modifications that is capable of a wide range of autonomous and intelligent behaviors, including smooth and comfortable trajectory generation and following; lane keeping and lane changing; intersection handling with or without V2I and V2V; and pedestrian, bicyclist, and workzone detection. Safety and reliability features include a fault-tolerant computing system; smooth and intuitive autonomous-manual switching; and the ability to fully disengage and power down the drive-by-wire and computing system upon E-stop. The vehicle has been tested extensively on both a closed test field and public roads.
    I. INTRODUCTION (Fig. 1: The CMU autonomous vehicle research platform in road test): Imagining autonomous passenger cars in mass production has been difficult for many years. Reliability, safety, cost, appearance and social acceptance are only a few of the legitimate concerns. Advances in state-of-the-art software and sensing have afforded great improvements in reliability and safe operation of autonomous vehicles in real-world conditions. As autonomous driving technologies make the transition from laboratories to the real world, so must the vehicle platforms used to test and develop them. The next [...] with simple driving scenarios, including distance keeping, lane changing and intersection handling [14], [6], [3], [12]. The NAVLAB project at Carnegie Mellon University (CMU) has built a series of experimental platforms since the 1990s which are able to run autonomously on freeways [13], but they can only drive within a single lane.