TRANSPORTATION RESEARCH BOARD

Accelerating Automated Vehicle Acceptance

July 14, 2020

@NASEMTRB #TRBwebinar

PDH Certification Information:
• 2.0 Professional Development Hours (PDH) – see follow-up email for instructions
• You must attend the entire webinar to be eligible to receive PDH credits
• Questions? Contact Reggie Gillum at [email protected]

The Transportation Research Board has met the standards and requirements of the Registered Continuing Education Providers Program. Credit earned on completion of this program will be reported to RCEP. A certificate of completion will be issued to participants that have registered and attended the entire session. As such, it does not include content that may be deemed or construed to be an approval or endorsement by RCEP.

Learning Objectives

1. Identify current AV practices

2. Discuss how data, policies, and trust impact the pace of automated system technologies

Are We There Yet? Building on the TRB Advancing Automated Vehicle Adoption Workshop

Valerie Shuman, Principal, SCG, LLC

https://connectedautomateddriving.eu/event/computers-wheels-whos-going-keep-track-driverless-vehicles/

Overview

• Key Questions
• Roundtable Insights

Are We There Yet?

• What is an AV anyway, and who sets that definition?
• Who consistently captures & reports this data for this population (the same way that NHTSA does for the driving public as a whole)?
• How do they do this?

• Roundtable Question: What AV metrics can we implement within 12 months (or sooner)?

https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812826

Tracking AV Capability/Performance Trends

• What are the key tasks and metrics (top 10? top 20?)?
• How do we regularly review these metrics as an industry to ensure we're trending in the right direction, and report progress?

• Roundtable Question: Propose a set of key driving tasks and metrics that we should be monitoring, and a national-level solution for monitoring them

What Did We Learn?

• What is an AV?
  – An AV is L3/L4 and above
  – Initial focus should be metrics for ADAS (L1/L2) as well as HAV systems

• What types of metrics?
  – Focus on efficiency, safety, and equity metrics
  – Look at scenario-based data and outcome metrics to understand the status of the overall fleet
  – Develop metrics for each mode
  – Consider regional requirements, such as different levels of AV functionality (e.g., rural areas)
  – Need to look at crashes and near misses; collect data on what's working and what's failing

Specific Metrics (1)

• Performance along a "familiar" route
• Route performance and adjustments
• Takeover time/controls in various road conditions and situations
• Interaction with local traffic "culture" – is the AV a good citizen?
• Consider overtaking distance (especially for bikes)
• Operational Design Domains (ODDs)
  – How much driving is done inside and outside of the ODD?
  – At L4, testing on all intended ODDs for a given road must be "green" before the vehicle can use that road
• Moving-object detection (including the speed at which detection is made)
• Develop third-party testing standards

Specific Metrics (2)

• Signal detection analysis for certain crash types in various scenarios
• Functional testing
• Secondary crashes
• Consider contributing factors/context – develop a list of factors, design scenarios, and test to understand crashes/mile
• Environmental data
• Takeover requests (planned/unplanned)
• Post-crash: what L1/L2 features were turned on (or not)? Was the driver aware of the feature?
• Biometrics of the person in the driver's seat – is the person in a good state, and can they reengage?
• Vehicle kinematics

How Do We Implement?

• Partner with the insurance industry, OEMs, hospitals, and public & private sector tracking
• Carefully consider the model to encourage private-sector sharing – anything too "regulatory" will be a challenge
• Develop nationally consistent/standardized metrics to allow data sharing and confidence
• Beware of unintended consequences from metrics (incentivize proper design targets)

In Summary

• There is a lot of nuance to consider
• Making choices is going to be tough
• Trust is less important than trustworthiness

https://www.automatedvehiclessymposium.org/register

Trusting Increasingly Autonomous Vehicle Technology

John D. Lee University of Wisconsin—Madison [email protected]

Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46(1), 50–80.
Lee, J. D., & Kolodge, K. (2019). Exploring trust in self-driving vehicles with text analysis. Human Factors. doi:10.1177/0018720819872672
Lee, J. D. (2020). Trust in automated, intelligent, and connected vehicles. In D. L. Fisher, W. J. Horrey, J. D. Lee, & M. A. Regan (Eds.), Handbook of Human Factors for Automated, Connected, and Intelligent Vehicles. CRC Press.
Chiou, E. K., & Lee, J. D. (in review). Trusting automated agents: Designing for appropriate cooperation. Human Factors.

Transportation as a Multi-Echelon Network with Many Trust Relationships

[Diagram: echelons from policy, standards, and societal infrastructure, through remote infrastructure and traffic, negotiated road situations, and driving functions and activities, down to vehicle sensors and controls – with trust relationships including remote operator trust in the AV, driver–automated vehicle trust, pedestrian trust in the AV, person–policymaker trust, engineers' trust in sensors, and driver trust in the sensor system.]

NTSB Finds Overreliance in Tesla Crash

NTSB Highway Accident Report: "…trim from the car was found entangled within the forward-most area of contact damage on the semitrailer. Figure 6 shows a postcrash photograph of the semitrailer, and figure 7 focuses on the damage to the semitrailer."


Figure 6. Damaged right side of the Utility semitrailer.

Figure 7. Closeup view of impact damage to the right side of the Utility semitrailer. The arrow indicates a segment of front windshield trim from the Tesla entrapped in the forward-most area of damage.

Figure 11. Chart showing how much time during the 41-minute crash trip that, while Autopilot was active, the driver had his hands on the steering wheel. Visual and auditory warnings are also indicated. (Timing provided is based on vehicle data and is approximate and relative.)

System Performance Data. The vehicle performance data revealed the following:

The crash-involved Tesla's last trip began at 3:55:23 p.m. The car was stopped or nearly stopped about 4:19 p.m. and again about 4:30 p.m. The collision with the truck occurred at 4:36:12.7, as indicated by fault codes and system disruptions.

The last driver input before the crash was to increase the TACC speed to 74 mph at 4:34:21, which was 1 minute 51 seconds before the crash. After that input, there was no driver interaction with Autopilot, no change in steering angle, and no brake lamp switch activation until the collision.

During the last trip, TACC detected a vehicle ahead of the Tesla seven times. For the final 1 minute 35 seconds preceding the crash, TACC detected no lead vehicle in front of the Tesla.

About 9.7 seconds before the collision, the motor torque demand steadily decreased (indicating that the vehicle was on a descending grade). The reported torque demand dropped to zero at the time of the first fault report.

No brakes were applied before or during the collision.

Vehicle headlights were not on at the time of the collision.

The driver was wearing his seat belt during the trip.

Throughout the approach to the collision with the truck, the electronic power assist steering exhibited no substantial changes in steering angle.

There was no record indicating that the Tesla's automation system identified the truck that was crossing in the car's path or that it recognized the impending crash. Because the system did not detect the combination vehicle—either as a moving hazard or as a stationary object—Autopilot did not reduce the vehicle's speed, and the FCW did not alert the driver.

Trust Calibration is Critical

[Figure: trust plotted against trustworthiness – trust exceeding trustworthiness is overtrust; trust falling short of trustworthiness is undertrust.]

Topic modeling of ~10k responses to the J.D. Power TechChoice survey surfaced themes including: hacking & glitches; computers make mistakes; errors & failures; safer than human; tested for a long time; feel uncomfortable; trust when mature; control until proven; technology improving; scary drivers and robots; many things go wrong; works good?; self-driving accidents.

Lee, J. D., & Kolodge, K. (2019). Exploring trust in self-driving vehicles with text analysis. Human Factors. doi:10.1177/0018720819872672

Psychophysics of Dread Risk Undermines Trust

• Dread risk is perceived as 1,000 times greater than controlled risk (Slovic, P. (1987). Perception of risk. Science, 236(4799), 280–285)
• Dread risk guides society to risky outcomes (Gigerenzer, G. (2004). Dread risk, September 11, and fatal traffic accidents. Psychological Science, 15(4))
• Trust declines with surprisingly poor behavior: easy misses (Madhavan, P., Wiegmann, D. A., & Lacson, F. C. (2006). Automation failures on tasks easily performed by operators undermine trust in automated aids. Human Factors, 48(2), 241–256)

Small changes produced 100% misclassification

Evtimov, I., Eykholt, K., Fernandes, E., Kohno, T., Li, B., Prakash, A., & Song, D. (2017). Robust physical-world attacks on machine learning models.

Help People See Benefits

Trusting Vehicle Technology

[Diagram: the basis of trust (societal, relational, experiential) feeds the dimensions of trust (purpose–betrayal, process–violation, performance–disappointment); trust leads to acceptance, moderated by perceived risk and sense of control.]

Lee, J. D. (2020). Trust in automated, intelligent, and connected vehicles. In D. L. Fisher, W. J. Horrey, J. D. Lee, & M. A. Regan (Eds.), Handbook of Human Factors for Automated, Connected, and Intelligent Vehicles. CRC Press.

Trusting in Increasingly Autonomous Vehicles

• Trusted and trustworthy technology – calibrated with capability and aligned with goals

• Trust in multi-echelon networks
  – Trusters beyond the direct users, to include other road users
  – Trust basis beyond vehicle sensors, to include policymakers

• Trust based on societal, relational, and experiential factors – avoid dread risk with transparency and control

John D. Lee
University of Wisconsin
[email protected] · jdlee888

Data for AV Integration: Accelerating Automated Vehicle Acceptance

Ariel Gold, Data Program Manager
U.S. Department of Transportation (USDOT), Intelligent Transportation Systems (ITS) Joint Program Office (JPO)

July 14, 2020

AV 4.0 & Data
• Builds upon AV 3.0 by expanding the scope to 38 relevant United States Government (USG) components
• Highlights crosscutting data-related items:
  – Privacy and data security
  – Consistent standards and policies
  – Multipronged approach to advance AI
  – Connectivity and data exchanges
• Features efforts aimed at enabling voluntary data exchanges

Source: USDOT https://www.transportation.gov/av/4

https://www.transportation.gov/av/data/wzdx

U.S. DOT's Data for AV Integration (DAVI) Initiative

Source: USDOT https://www.transportation.gov/av/data

DAVI Website

• DAVI Overview
• Guiding Principles
• DAVI Framework

Source: USDOT https://www.transportation.gov/av/data

DAVI Framework

Source: USDOT https://www.transportation.gov/av/data

The Work Zone Data eXchange (WZDx)

Source: Work Zone Data Working Group https://github.com/usdot-jpo-ode/jpo-wzdx
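To make the idea of a standardized work-zone feed concrete, the sketch below assembles a single work-zone record as JSON. The field names and values here are hypothetical illustrations only and are not drawn from the actual WZDx v2.0 schema; see the GitHub specification cited above for the real field definitions.

```python
# Hypothetical sketch of a WZDx-style work-zone feed entry.
# All field names and values below are illustrative assumptions,
# not the actual WZDx v2.0 schema.
import json

work_zone = {
    "road_name": "I-80 WB",                 # example roadway
    "event_status": "active",
    "start_date": "2020-07-14T09:00:00Z",
    "end_date": "2020-07-14T17:00:00Z",
    "reduced_speed_limit": 45,              # mph
    "vehicle_impact": "some-lanes-closed",
    "geometry": [                           # [longitude, latitude] pairs
        [-82.9988, 39.9612],
        [-82.9850, 39.9655],
    ],
}

# Serialize a feed containing one work zone, as an IOO might publish it.
feed = json.dumps({"work_zones": [work_zone]}, indent=2)
print(feed)
```

The point of the exchange is that many infrastructure owner-operators publish records in one agreed shape, so AV developers and navigation providers can consume every feed with a single parser.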

WZDx Demonstration Grants

• Total funding: $2.4M
• Number of awards: up to 12
• Potential award amounts: up to $200,000 each
• Period of performance: 12 months
• Cost share: 20% non-federal share
• Federal involvement: performance monitoring, technical guidance, and participation in status meetings, workshops, and technical group discussions

https://www.grants.gov/web/grants/view-opportunity.html?oppId=327731

Source: ITS JPO

Utilizing Common Work Zone Event Data for V2X and Cooperative ADS Applications: Improved Data Specifications & Tools

[Diagram summarizing the program:]
• Driving task questions: Where am I relative to my environment? What are the rules of the road that affect my path? What's changed from what I already know?
• Driving task content needs: speed limit changes, lane closures, signal locations, dynamic work zones, road furniture, geometry changes, road network, road usage restrictions, signal and VMS status
• Work zone field data collection: develop tools to collect IOO spatial data from the field to support work zone data collection
• Data fusion & decision (WZDI Program, environment data): identify improvements to spatial data elements for work zone events
• WZDx Specification: develop software translators for V2X and CARMA3 cooperative ADS applications

Source: FHWA

Cooperative ADS as a Component of Work Zone Event Data

• TMC Operator/IOOs enter basic information about work zone.

• Data is consistent with WZDx spec v2.0.

• Data collection automatically starts/ends when set starting/ending locations are reached.

• User interface to select current state of road/work zone.

Copyright: https://github.com/TonyEnglish/V2X-manual-data-collection
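The automatic start/stop behavior described above (collection begins and ends when set locations are reached) could be implemented with a simple geofence test against a GPS fix. This is an assumed illustration of the logic, not code from the linked repository, and the coordinates are invented examples.

```python
# Assumed sketch of geofenced start/stop for work-zone data collection.
# Not taken from the V2X-manual-data-collection repository.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(pos, anchor, radius_m=50.0):
    """True when pos is within radius_m of the anchor point."""
    return haversine_m(pos[0], pos[1], anchor[0], anchor[1]) <= radius_m

# Invented work-zone start location; collection would begin on the
# first GPS fix that falls inside this fence and end at a matching
# fence around the set ending location.
start = (39.9612, -82.9988)
print(in_geofence((39.9613, -82.9989), start))  # fix ~15 m away: True
```

A production implementation would also debounce noisy GPS fixes (e.g., require several consecutive fixes inside the fence) before toggling collection on or off.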

Cooperative ADS as a Component of Work Zone Event Data

• Received information is used to generate a work zone with new geospatial details in the back office (cloud) for validation.
• Overlay work zone map information.
• TMC operator verifies accuracy of the recorded work zone.
• TMC operator publishes verified work zones, available for third parties, 511, etc.
• Repository available at https://github.com/TonyEnglish/V2X-manual-data-collection

Copyright https://github.com/TonyEnglish/V2X-manual-data-collection

Open Training Data Sets

• Many outside the federal government are contributing open training data sets that assist with perception and other core Automated Driving System (ADS) functions
• Open training data sets:
  – BDD100K from the University of California at Berkeley
  – Waymo Open Dataset
  – Level 5 Dataset
  – Audi AEV Autonomous Driving Dataset
  – Ford Autonomous Vehicle Dataset

• Contact [email protected] to share other examples of open training data sets

Source: University of California at Berkeley (https://bdd-data.berkeley.edu/)

For More Information

Ariel Gold, Data Program Manager
U.S. Department of Transportation, Intelligent Transportation Systems Joint Program Office
[email protected]

Twitter: @ITSJPODirector
Facebook: www.facebook.com/DOTRITA
Website: www.its.dot.gov

ADAS Ratings for Consumer Information Program

Accelerating Automated Vehicle Acceptance TRB Webinar July 14, 2020

David Harkey, Insurance Institute for Highway Safety

iihs.org

IIHS consumer ratings

✓ 200+ 2019 models rated
✓ 10 evaluations per model
✓ 460 new ratings in 2019

IIHS crash testing programs
[Timeline:] 1995 moderate overlap front · 2003 side impact · 2004 rear (whiplash mitigation) · 2009 roof strength · 2012 small overlap front: driver-side · 2017 small overlap front: passenger-side

Effect of Crash Avoidance Systems on Claim Frequency
Results pooled across automakers

[Chart: percent change in claim frequency by coverage type (collision, property damage liability, bodily injury liability, MedPay, PIP) for forward collision warning, front autobrake, curve-adaptive headlights, lane departure warning, blind spot warning, parking sensors, rear camera, and rear autobrake; y-axis from -40% to 40%.]

Most Crash Avoidance Technologies Are Living Up to Expectations
Effects on relevant police-reported crash types

[Chart: percent change in relevant police-reported crashes, for all severities and injury crashes, with statistically significant results flagged, for forward collision warning, low-speed autobrake, FCW with autobrake, lane departure warning, and side-view assist (blind spot); y-axis from -80% to 20%.]

2020 TOP SAFETY PICK requirements

TOP SAFETY PICK+:
• Good ratings in the driver-side small overlap front, passenger-side small overlap front, moderate overlap front, side, roof strength, and head restraint tests
• Advanced or superior rating for front crash prevention as optional equipment
• Advanced or superior rating for pedestrian AEB as optional equipment
• Good or acceptable headlights as standard equipment

TOP SAFETY PICK:
• Good ratings in the driver-side small overlap front, passenger-side small overlap front, moderate overlap front, side, roof strength, and head restraint tests
• Advanced or superior rating for front crash prevention as optional equipment
• Advanced or superior rating for pedestrian AEB as optional equipment
• Good or acceptable headlights as optional equipment

Speed Reduction in 12 and 24 mph Tests

Speed reduction, 12 mph test: Volvo S60 – 12 mph; Dodge Durango – 6 mph; Subaru Outback – 12 mph
Speed reduction, 24 mph test: Volvo S60 – 2 mph; Dodge Durango – 9 mph; Subaru Outback – 12 mph
(Volvo S60: 2 points, advanced; Dodge Durango: 3 points, advanced; Subaru Outback: 6 points, superior)

Front Crash Prevention: Vehicle-to-Vehicle Ratings
2013–20 models, as of July 2020

[Chart: share of models by vehicle-to-vehicle front crash prevention rating for model years 2013 through 2020; y-axis 0%–100%.]

Pedestrian Test Scenarios

Pedestrian Crash Prevention Ratings
As of November 2019

[Chart: number of small SUVs and midsize sedans earning superior, advanced, basic, or no-credit ratings; y-axis 0–8.]

[Test scenarios pictured: adult walking from the right side, 25 mph condition; child running from the right side, 12 mph condition.]

Rear Crash Prevention Ratings
✓ Rear parking sensors
✓ Rear cross traffic alert
✓ Rear autobrake

• Reversing car-to-car, 16″ overlap
• Reversing car-to-car, 45° angle
• Reversing car-to-car, 10° angle
• Reversing toward fixed pole

Functional Performance and User Experience

2016 Tesla Model S with Autopilot (software ver. 7.1)
2017 BMW 5 series with Driving Assistant Plus
2017 Mercedes E-Class with Drive Pilot
2018 Volvo S90 with Pilot Assist
2018 Tesla Model 3 with Autopilot (software ver. 8.1)

Lane Keeping in Curves

[Chart: percentage of trials in which the system disengaged, crossed the lane line, drove on the lane line, or remained in the lane – BMW 5 series (n=16), Volvo S90 (n=17), Mercedes E-Class (n=17), Tesla Model S (n=18), Tesla Model 3 (n=18).]

Lane Keeping on Hills

[Chart: percentage of trials in which the system disengaged, crossed the lane line, drove on the lane line, or remained in the lane – BMW 5 series (n=14), Tesla Model S (n=18), Volvo S90 (n=17), Mercedes E-Class (n=18), Tesla Model 3 (n=18).]

Adaptive Cruise Control Trusted More Than Active Lane Keeping
Percentage of drivers who agreed or strongly agreed

[Chart: percentage agreeing with "I trust the automation to maintain speed and distance to the vehicle ahead" vs. "I trust the automation to keep me in the center of the lane" – Tesla Model S (Autopilot), Volvo S90 (Pilot Assist), BMW 5 series (Driving Assistant Plus), Infiniti QX50 (ProPilot Assist), Mercedes E-Class (Drive Pilot); y-axis 0–100.]

IIHS Consumer Information Does More…
Since 1995

✓ Empower consumers
✓ Identify gaps in safety regulations and testing programs
✓ Encourage automakers

✓ Accelerate technology integration!

Thank You

Today's Panelists
• Moderator: Cynthia Jones, DriveOhio
• John Lee, University of Wisconsin–Madison
• Valerie Shuman, Shuman Consulting Group
• Ariel Gold, US DOT
• David Harkey, Insurance Institute for Highway Safety

Get Involved with TRB
Receive emails about upcoming TRB webinars: https://bit.ly/TRBemails

Find upcoming conferences: http://www.trb.org/Calendar

Getting involved is free!

Be a Friend of a Committee: bit.ly/TRBcommittees
– Networking opportunities
– May provide a path to Standing Committee membership

Join a Standing Committee bit.ly/TRBstandingcommittee

Work with CRP https://bit.ly/TRB-crp

Update your information www.mytrb.org #TRB100