Vol. 30, No. 6, July/August 2017

Table of Contents

Essential to Autonomous Vehicle Development, Simulation Isn't Close to Being Ready ... 2
More Supplier Disruption Coming ... 6
Zoox Self-Driving Startup to Design, Build, Operate Its Own Urban Taxi Service ... 8
The Company Profile: Nexteer ... 11
Movimento Details OTA Update Types ... 20

The Hansen Report on Automotive Electronics, www.hansenreport.com
© 2017 Paul Hansen Associates, 150 Pinehurst Road, Portsmouth, NH 03801 USA. Telephone: 603-431-5859; fax: 603-431-5791; email: [email protected]. All rights reserved. Materials may not be reproduced in any form without written permission. ISSN 1046-1105

Essential to Autonomous Vehicle Development, Simulation Isn't Close to Being Ready

Despite more than one million road fatalities globally each year, human drivers perform exceedingly well, especially in the United States, where there were just 1.13 fatalities per hundred million miles traveled in 2015. While it has become a widely held belief that autonomous vehicles will soon be safer than vehicles driven by humans, proving that point will not be at all easy. Indeed, an automated vehicle under test would have to drive several hundred million miles on real roads to verify that it is at least as safe as a human-driven vehicle. That is not a real-world option.

One possible remedy is to use simulators to "drive" virtual vehicles millions of miles in virtual worlds, so that a sufficient number of exceptional, accident-causing edge cases can be found and addressed in a fraction of the time real-world testing would take. These edge cases are hard to anticipate because they can occur as a combination of random events: a sensor fails just after a rainstorm as the setting sun emerges from behind a black cloud, for example.

Simulation is critical to advancing autonomous vehicle development. "It's very important," said Erik Coelingh, responsible for developing autonomous driving technical strategy at Zenuity, the Volvo-Autoliv joint venture. "Just driving around hoping to find these extremely exceptional situations just isn't going to work. You have to use simulation to the largest extent possible."

Carl Squire of IPG Automotive, a Karlsruhe, Germany-based company that offers a model environment comprising a driver model, a vehicle model and flexible models for roads and traffic scenarios, sees two different approaches playing out in the industry. "Most of the startups and the Silicon Valley companies are doing what Google started several years ago: training image recognition software. ... They record a ton of video, annotate it and feed it into their algorithms. Most of the people I call on in the traditional automotive industry, however, view autonomous driving almost as an extension of assisted driving. They are doing a lot more simulation," he observed. "Over the last six months, I've witnessed those two philosophies beginning to merge. We now see companies using simulation to feed their image recognition algorithms." Mr. Squire is managing director of IPG Automotive USA in Ann Arbor, Michigan.

The extent to which simulation is possible today falls far short of a complete solution. I asked Alejandro Vukotich, vice president in charge of technical development of piloted driving at Audi, whether simulation suppliers will soon be ready with virtual worlds in which to test virtual autonomous vehicles. "There is still a long way to go for that," he said. "The simulation solutions we see are done by small, highly dedicated companies that specialize in just one issue."

Introduced in July 2017, Audi's new A8, the first production automobile designed specifically for automated driving, was developed with the help of simulation tools built in-house by the carmaker. "But it was painful," said Mr. Vukotich. "Simulation exceeds the capability of what a carmaker can do. Carmakers and suppliers need to join forces to develop a standard ecosystem for simulation where all the pieces are plug and play so they work seamlessly together." Given how small most of the companies tackling simulation are, considerably more funding will be required to develop that robust ecosystem.

One supplier that seems fairly far along the path to becoming a simulation system integrator is ANSYS, a computer-aided engineering software developer located near Pittsburgh, Pennsylvania. ANSYS only began integrating autonomous driving simulation tools at the end of 2015. Sandeep Sovani directs ANSYS's global automotive business. "There is an enormous need for simulation, but the level of sophistication required is so great that nobody yet has a complete solution. First, you need a model of the virtual world that includes the movements of all the players, such as pedestrians and other vehicles," explained Mr. Sovani. "Then you need to simulate all the sensors. Those sensors would look at the virtual world much like sensors on a real vehicle sense the environment around it. There need to be specific models for the camera, radar, lidar and ultrasonic sensors. You also need models for the vehicle control and vehicle dynamics so you can predict the motion of the vehicle, where the vehicle is going in this virtual world. All that needs to run in a closed loop."
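Mr. Sovani's description maps onto a simple loop: advance the virtual world, render sensor views of it, run the control stack, then update the vehicle dynamics and feed the result back into the world. The Python sketch below shows that structure in miniature; every model and name in it is a hypothetical placeholder standing in for far richer components, not ANSYS's or anyone else's actual tooling.

```python
# Minimal closed-loop simulation skeleton (illustrative placeholders only).

from dataclasses import dataclass

@dataclass
class VehicleState:
    x: float = 0.0   # position along the road, m
    v: float = 0.0   # speed, m/s

def world_step(t: float) -> dict:
    """Virtual world model: movements of the other players."""
    return {"lead_vehicle_x": 50.0 + 8.0 * t}   # one lead car at 8 m/s

def sense(world: dict, ego: VehicleState) -> dict:
    """Sensor models view the virtual world, as real sensors view the road."""
    return {"gap": world["lead_vehicle_x"] - ego.x}

def control(meas: dict, ego: VehicleState) -> float:
    """Vehicle control: a crude gap-keeping policy returning acceleration."""
    return 0.5 * (meas["gap"] - 30.0) - 0.8 * ego.v

def dynamics(ego: VehicleState, accel: float, dt: float) -> VehicleState:
    """Vehicle dynamics: predict where the vehicle goes next."""
    v = max(0.0, ego.v + accel * dt)
    return VehicleState(x=ego.x + v * dt, v=v)

# The closed loop: world -> sensors -> control -> dynamics -> back to world.
ego, dt = VehicleState(), 0.05
for step in range(1200):                        # one simulated minute
    world = world_step(step * dt)
    ego = dynamics(ego, control(sense(world, ego), ego), dt)
print(f"gap after 60 s: {sense(world_step(60.0), ego)['gap']:.1f} m")
```

A production tool chain would swap each placeholder for a heavyweight component (a full traffic scenario engine, physics-based sensor models, a vehicle dynamics package) while preserving exactly this loop.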
Sensor Models Needed

The latest versions of computer games like Grand Theft Auto are now capable of producing nearly photo-realistic graphics, which suggests the makers of these graphics may well have a role to play in developing the virtual worlds needed to test autonomous vehicles. Unity Technologies, a leading developer of game engine software, is building a desktop application to simulate environments for autonomous vehicles. But self-driving vehicle perception is not only a matter of computer vision.

"While the graphics in Grand Theft Auto may look like the real world from the perspective of a camera, the output from lidar and especially from radar is not at all easy to simulate," said Zenuity's Mr. Coelingh. "In practice, radar reflections are very complicated. They bounce off all kinds of things and follow multiple paths. We are not even close to making a simulation of radar that is close to reality."

Martijn Tideman, product director at TASS International, developer of PreScan, a virtual world and simulation tool that models radar, lidar, camera and ultrasonic sensors, disagrees about the applicability of video game graphics to autonomous driving simulation. "There has been a lot of buzz around Grand Theft Auto, but if you feed their video into an automotive camera it won't understand what it is seeing. You might recognize bright white lines, but the shadows are completely off and the reflections unrealistic. GTA emits only red, green and blue, whereas real automotive cameras see the full spectrum from ultraviolet to infrared. Our PreScan camera model is full spectrum."
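Mr. Tideman's spectral argument can be made concrete: a physics-based camera model integrates a scene's spectral radiance against the imager's spectral sensitivity, and silicon imagers respond well beyond the visible band, whereas an RGB game render carries only three fixed samples per pixel. The sketch below illustrates that integration with invented numbers; it is a generic physics illustration, not the PreScan camera model.

```python
# Illustrative spectral camera-response calculation (hypothetical values).

import numpy as np

wl = np.linspace(300.0, 1100.0, 161)  # wavelength grid, nm: near-UV to near-IR
dwl = wl[1] - wl[0]

def band(center_nm: float, width_nm: float) -> np.ndarray:
    """Gaussian spectral band, used here for both radiance and sensitivity."""
    return np.exp(-0.5 * ((wl - center_nm) / width_nm) ** 2)

# Hypothetical radiance of a scene point: visible energy plus a strong
# near-infrared component that a red/green/blue render cannot represent.
radiance = 0.6 * band(550.0, 120.0) + 0.9 * band(900.0, 80.0)

# Hypothetical sensitivity of a silicon automotive imager, which responds
# well past 700 nm unless an IR-cut filter removes that band.
sensitivity = band(700.0, 250.0)

# Pixel response = integral of radiance x sensitivity over wavelength.
full_spectrum = np.sum(radiance * sensitivity) * dwl

# The same scene collapsed to a single visible channel misses the IR energy.
green_only = np.sum(radiance * band(550.0, 50.0)) * dwl

print(f"full-spectrum response: {full_spectrum:.0f}")
print(f"visible-green response: {green_only:.0f}")
```

The gap between the two printed numbers stands for the information an RGB-only render throws away before a perception algorithm ever sees it.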
While radar models have been part of PreScan since 2007, TASS will soon launch a new radar simulation that accurately models radar's multiple bounces using ray-tracing technology. Computationally intensive, TASS's physics-based model requires a GPU cluster in order to run in an acceptable time frame.

The least safety-critical sensor, according to Mr. Tideman, is ultrasonic. "Ultrasonic is mainly used for parking maneuvers today. Our customers seem to accept our stochastic [randomly determined] model for ultrasonic sensors. But for the other sensors, camera, lidar and radar, our customers require physics-based models, which we supply." TASS's sensor models output raw data that can be used as input to develop algorithms, whether model in the loop, software in the loop or hardware in the loop, to virtually drive a million kilometers in a weekend.

No single company is able to provide everything needed to virtually validate autonomous vehicles. TASS can provide sensor models along with traffic and environmental modeling, but customers must work with many different partners who can provide much of what's missing, whether it's scenario modeling, vehicle modeling or algorithm modeling. "We connect to map databases: TomTom, HERE or StreetMap," said Mr. Tideman. "We connect to all the vehicle dynamics packages such as CarSim, VI-grade or ADAMS from MSC Software. We connect to PTV Vissim to simulate naturalistic traffic behavior in PreScan. They generate virtual traffic around our cars, the car you are trying to test. All the reflection properties for the camera, radar, etc., those are ours. The man or woman behind the steering wheel who cuts in when you least expect it, that is from PTV Vissim. To speed up simulation, we work with manufacturers of clusters such as IBM and Nvidia to run our tools."

Neural Networks to Accelerate Testing

Celite Milbrandt, a co-founder of Slacker Radio who has also been involved with other startups, including RideScout, acquired by Daimler in 2014, will soon bring to market an autonomous vehicle simulator called testTrack. His new company, monoDrive, was founded in May 2016 and has just eight employees.

Mr. Milbrandt claims that his simulator, which includes both static scene components such as buildings, roads and lane markings, and dynamic scene components including cars and pedestrians, will be sufficient to fully exercise an autonomous vehicle's perception and path-planning systems. "If you can drive safely through our simulator with the lidar, radar and camera inputs we give you, you will be able to drive on the road and see the same results," he asserted. The procedural generation of the dynamic scene portion is where monoDrive excels. "We use neural networks to train driver behavior. There will be cars that speed, tailgate, or slam on the brakes every time they see a need for caution.
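The article gives no detail on monoDrive's networks, but the general pattern of learning driver behavior from data can be sketched: a small policy network maps a traffic state, here just the gap to the car ahead and the closing speed, to a commanded acceleration, and its weights are fit to recorded examples, synthesized below, of tailgating and hard braking. Everything in this sketch, from the features to the training rule, is a hedged assumption for illustration, not monoDrive's implementation.

```python
# Hypothetical sketch of training a small neural network to imitate
# recorded driver behavior (here synthesized): tailgating plus sudden
# braking. Illustrates the pattern only; not monoDrive's implementation.

import numpy as np

rng = np.random.default_rng(0)

def aggressive_driver(gap, closing):
    """Stand-in for logged human data: keeps a small gap (tailgates) and
    slams on the brakes when a short gap is closing fast."""
    accel = 0.4 * (gap - 8.0) + 0.9 * closing
    return np.where((gap < 10.0) & (closing < -2.0), -8.0, accel)

# "Recorded" training set: traffic states and the driver's response.
gap = rng.uniform(2.0, 60.0, 4000)         # distance to car ahead, m
closing = rng.uniform(-8.0, 8.0, 4000)     # closing speed, m/s
X = np.column_stack([gap / 30.0, closing / 8.0])   # normalized inputs
y = aggressive_driver(gap, closing) / 10.0         # scaled target accel

# One hidden layer, trained by full-batch gradient descent on MSE loss.
W1, b1 = rng.normal(0.0, 1.0, (2, 32)), np.zeros(32)
W2, b2 = rng.normal(0.0, 0.1, (32, 1)), np.zeros(1)
lr = 0.5
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                       # forward pass
    err = ((h @ W2 + b2).ravel() - y) / len(y)     # dLoss/dPrediction
    dh = (err[:, None] @ W2.T) * (1.0 - h ** 2)    # backprop through tanh
    W2 -= lr * (h.T @ err[:, None])
    b2 -= lr * err.sum(keepdims=True)
    W1 -= lr * (X.T @ dh)
    b1 -= lr * dh.sum(axis=0)

# The learned policy can now drive background traffic in a virtual scene.
probe = np.array([[9.0 / 30.0, -3.0 / 8.0]])       # short gap, closing fast
accel = 10.0 * (np.tanh(probe @ W1 + b1) @ W2 + b2).item()
print(f"commanded acceleration: {accel:.1f} m/s^2")  # negative: it brakes
```

In a full simulator, many copies of such a learned policy, fit to different slices of recorded driving, would populate the scene with the speeders, tailgaters and brake-slammers Mr. Milbrandt describes.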