Autonomous Vehicles

Industry Outlook and Investment Opportunities

Introduction

Fully autonomous, self-driving vehicles may be just years away. The Society of Automotive Engineers (SAE) classifies vehicles by their level of automation, ranking them from 0 through 5. Most autonomous vehicles on the road today are Level 2, including the entirety of Tesla’s product line. The only Level 3 vehicle currently available to consumers is the Audi A8, which still requires the presence of a human driver but can maneuver independently on major roads. Level 4 autonomous vehicles, capable of navigating without human intervention, are of great interest to manufacturers.

Background

Auto manufacturers are racing to reach the Level 4 benchmark. This year, carmakers like Honda plan to release Level 3 vehicles while shifting their focus toward the development of Level 4 vehicles; Honda aims to release that next generation of autonomous vehicles by 2024. Companies like Waymo, Yandex, and Baidu are not original equipment manufacturers; they commercially retrofit cars and refine technology developed by OEMs and suppliers such as Qualcomm and Ford. Earlier in the decade, following the breakthroughs enabled by complex, GPU-trained neural networks, projections placed Level 4 autonomous vehicles on the market by 2020; it is now expected that some US-based companies will reach that milestone by 2021.

Highlights:

Level 3 vehicles, which require human operation but do not need “eyes on the road,” are currently authorized in Japan and South Korea, and have begun to launch internationally:

• The Audi A8’s AI Traffic Jam Pilot system is awaiting regulatory approval in the US.
• The Honda Legend is set to be released in 2020 in Japan.
• Tesla, Hyundai, Kia, BMW, and Mercedes-Benz each have Level 3 vehicles slated for release in 2021.
• 332,932 autonomous vehicle units were produced in 2019, up from 137,129 in 2018, according to Gartner.

Estimates of industry growth are based primarily on expected production and widespread release of Level 3 and 4 vehicles:

• In the near term, autonomous vehicle production is expected to exceed 740,000 units by 2023, and IHS Markit projects that by 2040, 33 million autonomous vehicles will be sold globally.
• The autonomous vehicle market will grow from $54.23 billion to $556.67 billion between 2019 and 2026, according to Allied Market Research (the implied annual growth rate is sketched after this list).
• Commercial vehicle sales will comprise as much as 75% of the autonomous vehicle market by 2035, according to Navigant Consulting.
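As a rough check on the Allied Market Research figures above, the implied compound annual growth rate (CAGR) can be computed directly from the 2019 and 2026 market-size estimates. The short Python sketch below performs that calculation; it is illustrative only and adds no data beyond the two cited figures.

```python
# Implied compound annual growth rate (CAGR) from the Allied Market Research
# estimates cited above: $54.23B (2019) to $556.67B (2026).

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Return the compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

implied_growth = cagr(54.23, 556.67, 2026 - 2019)
print(f"Implied CAGR 2019-2026: {implied_growth:.1%}")  # roughly 39-40% per year
```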

Neural Nets and Scanning Technology

Training neural networks for a task is difficult and expensive. The network effects of data collection via already-deployed vehicles are a major boon for companies like Tesla, which run multiple deep neural networks simultaneously on the vehicle’s onboard computer, processing vast amounts of sensor data to perceive the environment and make safe decisions.
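As a simplified illustration of the kind of perception workload described above (not any manufacturer’s actual architecture), the sketch below defines a small convolutional network in PyTorch that maps a single camera frame to scores over a handful of object classes; the class set and dimensions are illustrative assumptions.

```python
# Minimal sketch of a camera-frame perception network (illustrative only;
# not any manufacturer's actual architecture).
import torch
import torch.nn as nn

class TinyPerceptionNet(nn.Module):
    """Maps an RGB camera frame to scores over a few object classes."""
    def __init__(self, num_classes: int = 4):  # e.g. car, pedestrian, cyclist, background
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        x = self.features(frame).flatten(1)
        return self.classifier(x)

# One 224x224 RGB frame, batch size 1.
frame = torch.rand(1, 3, 224, 224)
scores = TinyPerceptionNet()(frame)
print(scores.shape)  # torch.Size([1, 4])
```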

Advanced Driver Assistance Systems (ADAS)

ADAS are designed to enable increasingly autonomous vehicle operation. Examples of ADAS technology include lane-keeping, self-parking, and collision avoidance. Neural nets use sensor data to recognize objects during navigation, and the appropriate suite of sensor technologies is a subject of debate among industry experts. “Edge cases,” situations in which a particular technology fails, are a serious concern for the industry.
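To make the collision-avoidance example concrete, the sketch below implements a basic time-to-collision (TTC) check of the kind such logic commonly builds on; the threshold and braking rule are illustrative assumptions, not any production system’s behavior.

```python
# Basic time-to-collision (TTC) check, a common building block of
# collision-avoidance ADAS logic (thresholds here are illustrative).

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:          # not closing on the lead vehicle
        return float("inf")
    return gap_m / closing_speed_mps

def should_brake(gap_m: float, ego_speed_mps: float, lead_speed_mps: float,
                 ttc_threshold_s: float = 2.5) -> bool:
    """Trigger automatic braking when the TTC drops below the threshold."""
    ttc = time_to_collision(gap_m, ego_speed_mps - lead_speed_mps)
    return ttc < ttc_threshold_s

# Ego car at 25 m/s, lead car at 15 m/s, 20 m apart -> TTC = 2 s -> brake.
print(should_brake(gap_m=20.0, ego_speed_mps=25.0, lead_speed_mps=15.0))  # True
```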

3D Camera Vision

3D Camera Vision is the predominant scanning technology in Level 1 and 2 autonomous vehicles; it is used to perform object recognition and to render a 3D model of the vehicle’s environment. The technology also helps autonomous vehicles locate themselves on maps. However, 3D Camera Vision can have difficulty recognizing objects in rain, snow, and fog, and camera-only data scores poorly on the KITTI object-recognition benchmark, considered the industry standard. Using Camera Vision neural nets outside of familiar environments also presents challenges. Companies like Tesla rely heavily on Camera Vision, which provides navigation data via the network effect of the images collected by their vehicle fleet. This data is used to train and refine their neural nets, continuously improving performance.
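One common way camera-only systems recover the 3D structure mentioned above is stereo triangulation: depth is estimated from the disparity between matched pixels in a left and right camera image. The sketch below shows the standard depth-from-disparity relationship; the focal length and baseline values are illustrative assumptions, not any vehicle’s calibration.

```python
# Depth from stereo disparity: depth = focal_length * baseline / disparity.
# Camera parameters below are illustrative, not from any specific vehicle.
import numpy as np

def disparity_to_depth(disparity_px: np.ndarray,
                       focal_length_px: float = 720.0,
                       baseline_m: float = 0.54) -> np.ndarray:
    """Convert a per-pixel disparity map (pixels) to a depth map (meters)."""
    disparity = np.clip(disparity_px, a_min=1e-3, a_max=None)  # avoid divide-by-zero
    return focal_length_px * baseline_m / disparity

# A pixel with 20 px of disparity lies roughly 19.4 m away under these parameters.
print(disparity_to_depth(np.array([20.0])))
```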

LiDAR (Light Detection and Ranging)

LiDAR units, a recent innovation in autonomous vehicle navigation, create a 3D model of the environment used to direct the car’s steering, acceleration, and braking. Data collected by the lasers are processed on a microchip inside the LiDAR unit before being sent to the car’s central computer. LiDAR technology can add significantly to a vehicle’s price: Velodyne, which first developed the technology and dominates the market, sells units for as much as $80,000 apiece. Other companies, such as Ouster, a leading provider of high-resolution LiDAR sensors, compete on price by reducing the number of microchips in use. Solid State LiDAR is a stationary system that is smaller, cheaper, and more resilient than traditional spinning units; it has not yet seen commercial adoption, and multiple units must be installed on a vehicle for it to be effective.
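The 3D model described above is assembled from individual laser returns. As a simplified illustration (not any vendor’s firmware), the sketch below converts raw LiDAR measurements, given as range, azimuth, and elevation, into the Cartesian points a central computer would combine into a point cloud.

```python
# Convert raw LiDAR returns (range, azimuth, elevation) into Cartesian points.
# A simplified illustration of point-cloud construction, not vendor firmware.
import numpy as np

def lidar_returns_to_points(ranges_m: np.ndarray,
                            azimuth_rad: np.ndarray,
                            elevation_rad: np.ndarray) -> np.ndarray:
    """Return an (N, 3) array of x, y, z points in the sensor frame."""
    x = ranges_m * np.cos(elevation_rad) * np.cos(azimuth_rad)
    y = ranges_m * np.cos(elevation_rad) * np.sin(azimuth_rad)
    z = ranges_m * np.sin(elevation_rad)
    return np.stack([x, y, z], axis=-1)

# Three returns from a single sweep (values illustrative).
points = lidar_returns_to_points(
    ranges_m=np.array([12.0, 30.5, 4.2]),
    azimuth_rad=np.radians([0.0, 45.0, 90.0]),
    elevation_rad=np.radians([-2.0, 0.0, 1.5]),
)
print(points.shape)  # (3, 3)
```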

LiDAR vs Camera Vision

LiDAR systems have an advantage over camera-based systems in that little processing is required to gauge distance: the laser measures range directly, down to the centimeter, and registers the result on the unit’s microchip.
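That direct distance measurement comes from timing the laser pulse’s round trip. The sketch below shows the standard time-of-flight calculation; the pulse timing is an illustrative value.

```python
# Time-of-flight ranging: distance = (speed of light x round-trip time) / 2.
SPEED_OF_LIGHT_MPS = 299_792_458.0

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT_MPS * round_trip_s / 2.0

# A pulse that returns after 200 nanoseconds hit an object roughly 30 m away.
print(round(range_from_time_of_flight(200e-9), 2))  # 29.98
```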

LiDAR critics like Tesla CEO Elon Musk favor the exclusive use of Camera Vision to enable autonomous vehicle navigation for several reasons:

• They consider technologies like LiDAR to be merely a “crutch” for edge-case failures that Camera Vision must overcome in the long run, in addition to being prohibitively expensive.
• LiDAR’s centimeter-level precision may be less useful for steering and acceleration decisions than it appears when compared with camera-based systems.

Other Navigation Technologies:

• Radar sensors have 2D (and potentially 3D) applications that also perform object recognition tasks.
• M2M (machine-to-machine) technology, i.e. vehicle-to-vehicle communication, enhances navigation accuracy (a minimal message sketch follows this list).
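To illustrate the kind of information vehicle-to-vehicle communication exchanges, the sketch below defines a simple position-and-speed message. The fields are hypothetical and only loosely modeled on the idea of a basic safety message; they do not reflect any standard’s actual schema.

```python
# Hypothetical vehicle-to-vehicle (V2V) status message; field names and units
# are illustrative assumptions, not a standardized message format.
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class VehicleStatusMessage:
    vehicle_id: str
    latitude_deg: float
    longitude_deg: float
    speed_mps: float
    heading_deg: float
    timestamp_s: float

    def to_json(self) -> str:
        """Serialize for broadcast to nearby vehicles."""
        return json.dumps(asdict(self))

msg = VehicleStatusMessage("veh-042", 37.7749, -122.4194, 13.4, 270.0, time.time())
print(msg.to_json())
```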

Industry Effects & Cost-Benefit Analysis

The introduction of autonomous vehicles represents a significant displacement risk to truckers and other commercial vehicle drivers, with estimates for displacement as high as 3.5 million professional drivers in the US alone. However, among the 2.4 million heavy and tractor-trailer truck drivers, only an estimated 456,000 perform long-haul duties most vulnerable to automation.

• Retraining commercial vehicle drivers to safely operate commercial autonomous vehicles is important for mitigating the effect displacement will have on national unemployment. Operators of commercial vehicles will likely remain a vital part of the industry, serving the ancillary functions truckers already perform, such as log-keeping, maintenance, and cargo monitoring.
• Ensuring that the traditionally lower-skilled workers displaced by automation receive opportunities for retraining will be an important ESG issue for trucker unions and customers.

• The effect on ancillary services such as truck stops is likely to be marginal or nonexistent, even with Level 4 adoption, which still requires the presence of human operators.
• Published research such as Socially Responsible Automation, a paper by Pramod Khargonekar, Vice Chancellor at the University of California, Irvine, and Meera Sampath, Associate Vice Chancellor at the State University of New York, offers a potential framework for reconciling the benefits and costs of automation. SRA asks industry leaders to combine cost cutting and efficiency with workforce development and may become a competitive benchmark among manufacturers.

Impact Measurement

The cost of automation to regular consumers is expected to fall dramatically with scale, with consumers projected to pay a premium of up to $10,000 for an autonomous vehicle by 2025. Reducing the impact of car crashes on public infrastructure, welfare, and productivity represents a significant benefit to legislative stakeholders: 93% of crashes between 2005 and 2007 were human-caused, according to the National Highway Traffic Safety Administration, and full automation that reduced crash incidence could have saved nearly $280 billion in 2010. Reduction of greenhouse gas emissions is also a central ESG benefit of autonomous vehicle adoption, with research suggesting that GHG reductions from autonomous vehicles in ridesharing applications could be as high as 90%. However, this research is not conclusive; an increase in vehicle use due to automation could limit the GHG reduction benefits of autonomous ridesharing.

Figure 2. Projected growth in the 3D lidar market between 2019 and 2030. Courtesy of IDTechEx

Investment Outlook

Automakers are acquiring startups that can speed research and development. Companies are also forming joint international ventures to increase access to technology and data, which complicates IP boundaries. IP security is vital to protecting sensitive R&D information as companies race to introduce fully autonomous vehicles to the market. Shared ownership of IP between R&D partners can present a significant due diligence challenge for investors in autonomous vehicle companies: understanding licensing agreements is crucial to knowing whether ownership of proprietary data is necessary to the functioning of a company’s navigational system. Investors and autonomous vehicle manufacturers face few significant regulatory hurdles beyond those governing data collection and the privacy-protection principles agreed upon by major manufacturers.

VC & Private Equity Funds:

• Trucks Venture Capital
• New Enterprise Associates
• KKR
• Toyota AI Ventures
• Sequoia Capital
• Silver Lake Capital

Emerging Technologies:

• Ushr
• Nuro AI
• Pony AI
• Hesai
• Xpeng Motors
• SenseTime
• Five AI

Public Companies:

• Tesla
• Alphabet (Waymo)
• Amazon (Scout)
• GM (Cruise)
• Ford (Argo AI)
• Aptiv
