Computing Systems for Autonomous Driving: State-of-the-Art and Challenges
Liangkai Liu∗, Sidi Lu∗, Ren Zhong∗, Baofu Wu∗, Yongtao Yao∗, Qingyang Zhang†, Weisong Shi∗
∗Department of Computer Science, Wayne State University, Detroit, MI, USA, 48202
†School of Computer Science and Technology, Anhui University, Hefei, China, 230601
{liangkai, lu.sidi, ren.zhong, baofu.wu, yongtaoyao, weisong}@wayne.edu, [email protected]

arXiv:2009.14349v3 [cs.DC] 7 Dec 2020

Abstract— The recent proliferation of computing technologies (e.g., sensors, computer vision, machine learning, and hardware acceleration) and the broad deployment of communication mechanisms (e.g., DSRC, C-V2X, 5G) have pushed the horizon of autonomous driving, which automates the decision and control of vehicles by leveraging the perception results from multiple sensors. The key to the success of these autonomous systems is making reliable decisions in a real-time fashion. However, accidents and fatalities caused by early deployed autonomous vehicles arise from time to time. The real traffic environment is too complicated for current autonomous driving computing systems to understand and handle. In this paper, we present state-of-the-art computing systems for autonomous driving, including seven performance metrics and nine key technologies, followed by twelve challenges to realize autonomous driving. We hope this paper will gain attention from both the computing and automotive communities and inspire more research in this direction.

I. INTRODUCTION

Recently, with the vast improvements in computing technologies, e.g., sensors, computer vision, machine learning, and hardware acceleration, and the wide deployment of communication mechanisms, e.g., Dedicated Short-Range Communications (DSRC), Cellular Vehicle-to-Everything (C-V2X), and 5G, autonomous driving techniques have attracted massive attention from both the academic and automotive communities [1], [2]. According to [3], the global autonomous driving market is expected to grow to $173.15B by 2030. Many automotive companies have made enormous investments in this domain, including ArgoAI, Audi, Baidu, Cruise, Mercedes-Benz, Tesla, Uber, and Waymo, to name a few [4], [5], [6], [7]. Several fleets of SAE (Society of Automotive Engineers) L4 vehicles have been deployed around the world [8], [9].

To achieve autonomous driving, the essential task is determining how to make the vehicle understand the environment correctly and apply safe controls in real time. Rich sensors, including camera, LiDAR (Light Detection and Ranging), Radar, Inertial Measurement Unit (IMU), Global Navigation Satellite System (GNSS), and Sonar, as well as powerful computation devices, are installed on the vehicle [10], [11], [12], [13], [14]. This design makes the autonomous vehicle a truly powerful "computer on wheels." In addition to hardware, the rapid development of deep learning algorithms for object/lane detection, simultaneous localization and mapping (SLAM), and vehicle control also promotes the real deployment and prototyping of autonomous vehicles [15], [16], [17], [18]. The autonomous vehicle's computing systems are defined to cover everything excluding the vehicle's mechanical parts, including sensors, computation, communication, storage, power management, and full-stack software. Plenty of algorithms and systems are designed to process sensor data and make a reliable decision in real time.

However, news of fatalities caused by early developed autonomous vehicles (AVs) arises from time to time. As of August 2020, five self-driving car fatalities had happened with level-2 autonomous driving: four of them from Tesla and one from Uber [19]. Table I summarizes the date, place, company, and reason for each of these five fatalities. The first two fatalities attributed to Tesla happened in 2016: in the first accident, neither the Autopilot system nor the driver recognized the truck under thick haze, while the vehicle in the second incident mistook the truck for open space. In the 2018 incident involving Tesla, the Autopilot failed to recognize the highway divider and crashed into it. The most recent fatality from Tesla happened in 2019 because the vehicle failed to recognize a semi-trailer. The fatality from Uber happened because the autonomous driving system failed to recognize a jaywalking pedestrian.

In summary, all four incidents associated with Tesla are due to perception failures, while Uber's incident happened because of a failure to predict human behavior. Another fact to pay attention to is that, currently, the field testing of level-2 autonomous driving vehicles mostly happens in places with good weather and light traffic conditions, like Arizona and Florida. The real traffic environment is too complicated for the current autonomous driving systems to understand and handle easily. The objectives of level 4 and level 5 autonomous driving require colossal improvement of the computing systems for autonomous vehicles.

This paper presents state-of-the-art computing systems for autonomous driving, including seven performance metrics and nine key technologies, followed by eleven challenges and opportunities to realize autonomous driving. The remaining parts of this paper are organized as follows: Section II discusses the reference architecture of the computing systems for autonomous driving. In Section III, we show some metrics used in the evaluation of the computing system. Section IV discusses the key technologies for autonomous driving. Section V presents possible challenges. Finally, this paper concludes in Section VI.

TABLE I
LIST OF FATALITIES CAUSED BY LEVEL 2 AUTONOMOUS DRIVING VEHICLES.

Date         | Place                           | Company | Reason
20 Jan. 2016 | Handan, Hebei, China            | Tesla   | failed to recognize a truck under thick haze
07 May 2016  | Williston, Florida, USA         | Tesla   | mistook the truck for open sky
18 Mar. 2018 | Tempe, Arizona, USA             | Uber    | failed to recognize a jaywalking pedestrian at night
23 Mar. 2018 | Mountain View, California, USA  | Tesla   | failed to recognize the highway divider
1 Mar. 2019  | Delray Beach, Florida, USA      | Tesla   | failed to recognize a semi-trailer

TABLE II
END-TO-END APPROACHES FOR AUTONOMOUS DRIVING.

Work | Methods        | Characteristics
[24] | supervised DNN | raw image to steering angles for off-road obstacle avoidance on mobile robots
[25] | supervised DNN | maps an input image to a small number of key perception indicators
[26] | supervised DNN | CNN to map raw pixels from a camera directly to steering commands
[27] | supervised DNN | FCN-LSTM network to predict multi-modal discrete and continuous driving behaviors
[28] | DQN            | automated driving framework in a simulator environment
[29] | DQN            | lane following on a countryside road without traffic, using a monocular image as input

II. REFERENCE ARCHITECTURE

Figure 1 shows a representative reference architecture of the computing system on autonomous vehicles. Generally, the computing system for autonomous driving vehicles can be divided into computation, communication, storage, security and privacy, and power management. Each part covers four layers: sensors, operating system (OS), middleware, and applications. The following paragraphs discuss the corresponding components.

For safety, one of the essential tasks is to enable the "computer" to understand the road environment and send correct control messages to the vehicle. The whole pipeline starts with the sensors. Plenty of sensors can be found on an autonomous driving vehicle: camera, LiDAR, radar, GPS/GNSS, ultrasonic, inertial measurement unit (IMU), etc. These sensors capture real-time environment information for the computing system, like the eyes of human beings.
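The end-to-end approaches collected in Table II all learn a single mapping from raw sensor input to a control command. The following toy sketch is not from the paper: a plain linear model trained on synthetic four-pixel "frames" stands in for the supervised DNNs of [24]–[27], and the steering sign convention is an arbitrary assumption, but the supervised imitation-learning loop has essentially this shape.

```python
# Illustrative sketch only: a linear model on synthetic data stands in
# for the supervised end-to-end DNNs summarized in Table II.

def predict(weights, pixels):
    # Map a flattened "camera frame" directly to a steering angle.
    return sum(w * p for w, p in zip(weights, pixels))

def sgd_step(weights, pixels, target, lr=0.01):
    # One supervised update toward the demonstrated steering angle.
    error = predict(weights, pixels) - target
    return [w - lr * error * p for w, p in zip(weights, pixels)]

# Synthetic demonstrations: (4-pixel frame, demonstrated steering angle).
demos = [
    ([0.9, 0.1, 0.1, 0.1], -0.5),   # bright region on one side
    ([0.1, 0.1, 0.1, 0.9], 0.5),    # bright region on the other side
    ([0.1, 0.1, 0.1, 0.1], 0.0),    # uniform frame: go straight
]

weights = [0.0, 0.0, 0.0, 0.0]
for _ in range(2000):               # epochs of stochastic gradient descent
    for pixels, angle in demos:
        weights = sgd_step(weights, pixels, angle)

for pixels, angle in demos:
    print(f"predicted {predict(weights, pixels):+.2f} vs demonstrated {angle:+.2f}")
```

Real systems replace the linear map with a deep network and the synthetic frames with logged camera data, but the "raw pixels in, control command out" structure is what distinguishes every row of Table II from a modular pipeline.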
As an essential part of the whole autonomous driving vehicle, the computing system plays a significant role in the whole pipeline of driving autonomously. There are two types of designs for computing systems on autonomous vehicles: modular-based and end-to-end based.

Modular design decouples localization, perception, control, etc. into separate modules, which makes it possible for people with different backgrounds to work together [20]. The DARPA challenges were a milestone for the prototyping of autonomous driving vehicles, including Boss from CMU [21], Junior from Stanford [22], and TerraMax and BRAiVE from the University of Parma [23]. Their designs are all based on modules including perception, mission planning, motion planning, and vehicle control. Similarly, the survey fleet vehicles developed by Google and Uber are also modular-based [4], [7]. The main differences among these AV prototypes are the software and the configuration of sensors like camera and LiDAR.

The operating system (OS) plays a vital role between hardware devices (sensors, computation, communication) and applications. Within the OS, drivers are bridges between the software and hardware devices; the network module provides the abstract communication interface; the scheduler manages the competition for resources; and the file system provides an abstraction of the resources. For safety-critical scenarios, the operating system must satisfy real-time requirements.

As the middle layer between applications and operating systems [31], middleware provides usability and programmability to develop and improve systems more effectively. Generally, middleware supports publish/subscribe, remote procedure call (RPC) or service, time synchronization, and multi-sensor collaboration. A typical example of a middleware system is the Robot Operating System (ROS) [32]. On top of the operating system and middleware, several applications, including perception, planning, and vehicle control, are deployed.
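The publish/subscribe support that middleware provides is easiest to see in code. The sketch below is illustrative only: a tiny in-process broker stands in for what ROS provides across processes and machines, and the topic name and callback are hypothetical.

```python
# Minimal in-process publish/subscribe broker (illustrative only):
# ROS-style middleware offers this pattern across processes and machines.
from collections import defaultdict

class Broker:
    def __init__(self):
        # topic name -> list of subscriber callbacks
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver to every subscriber; the publisher never names them.
        for callback in self.subscribers[topic]:
            callback(message)

# Hypothetical topic: a planner reacts to perception output.
bus = Broker()
planned = []
bus.subscribe("/perception/objects", lambda obj: planned.append(f"avoid {obj}"))
bus.publish("/perception/objects", "pedestrian")
print(planned)
```

Because modules agree only on topic names and message types, a perception module can be swapped out without touching the planner, which is the decoupling that the modular designs above rely on.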