THERMAL FOR ADAS AND AV | PART 1
WHY ADAS AND AUTONOMOUS VEHICLES NEED THERMAL INFRARED CAMERAS
A CHALLENGING REQUIREMENT CALLS FOR ADVANCED TECHNOLOGY

Safe advanced driver assist system (ADAS) vehicles and autonomous vehicles (AV) require sensors that deliver scene data adequate for their detection and classification algorithms to navigate autonomously under all conditions at SAE automation level 5.¹ This is a challenging requirement for engineers and developers. Visible cameras, sonar, and radar are already in use on production vehicles today at SAE automation level 2, and SAE automation level 3 and 4 test platforms have added light detection and ranging (LIDAR) to their sensor suites. Each of these technologies has strengths and weaknesses. Tragically, as recent Uber and Tesla accidents have shown, the current sensors at SAE levels 2 and 3 do not adequately detect cars or pedestrians.

The Governors Highway Safety Association states that the number of pedestrian fatalities in the U.S. has grown substantially faster than all other traffic deaths in recent years, and that pedestrians now account for a larger proportion of traffic fatalities than they have in the past 33 years. Pedestrians are especially at risk after dark, when 75% of the 5,987 U.S. pedestrian fatalities of 2016 occurred.² Thermal, or longwave infrared (LWIR), cameras can detect and classify pedestrians in darkness and through most fog conditions and are unaffected by sun glare, delivering improved situational awareness that results in more robust, reliable, and safe ADAS and AV.

Figure 1. Recent Uber and Tesla accidents ("Self-Driving Uber Car Involved in Fatal Accident," March 2018; "Tesla Hits Cop Car While Allegedly on Autopilot," May 2018) show the need for a higher-performance ADAS sensor suite at SAE automation levels 2 and greater. (Sidebar: the SAE automation levels, from none and 01 driver assistance through 02 partial, 03 conditional, 04 high, and 05 full automation.)

Figure 2. 2016 pedestrian fatalities by light level (percent of fatalities in dark, daylight, and dawn/dusk conditions). Source: Governors Highway Safety Association.

¹ SAE International, J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, https://web.archive.org/web/20170903105244/https://www.sae.org/misc/pdfs/automated_driving.pdf
² Richard Retting and Sam Schwartz, Governors Highway Safety Association, Pedestrian Traffic Fatalities by State (2017 Preliminary Data), https://www.ghsa.org/sites/default/files/2018-02/pedestrians18.pdf

MATCH THE RIGHT TECHNOLOGY WITH THE OPTIMAL APPLICATION

ADAS and AV platforms use several sensing technologies (Table 1), and the core approach is to detect and subsequently classify objects in order to determine a course of action. For example, radar and LIDAR systems generate a point-density cloud from the reflections they gather and calculate an object's range and closing speed.
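To make that calculation concrete, here is a minimal sketch, with purely illustrative numbers of our own (not vendor code), of how range and closing speed fall out of time-of-flight returns:

```python
# Minimal illustration: range and closing speed from time-of-flight returns.
# All sample values are hypothetical.

C = 299_792_458.0  # propagation speed of a radar/LIDAR pulse (speed of light), m/s


def range_from_tof(round_trip_s: float) -> float:
    """Range in meters from a round-trip time of flight."""
    return C * round_trip_s / 2.0  # halved because the pulse travels out and back


def closing_speed(r_old_m: float, r_new_m: float, dt_s: float) -> float:
    """Closing speed in m/s from two range samples taken dt_s apart.
    Positive values mean the object is approaching."""
    return (r_old_m - r_new_m) / dt_s


# Two returns from the same object, sampled 0.1 s apart:
r1 = range_from_tof(5.72e-7)  # ~85.7 m
r2 = range_from_tof(5.60e-7)  # ~83.9 m
print(f"range {r2:.1f} m, closing at {closing_speed(r1, r2, 0.1):.1f} m/s")
```

In practice, radar also measures closing speed directly from the Doppler shift of the return; differencing successive ranges is simply the easiest form to illustrate.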
To generate the amount of data needed for object classification in a cost-effective and reliable solution, radar and LIDAR are fused with the output from visible and thermal cameras to cover all driving conditions.

Classification is challenging in poor lighting conditions, nighttime driving, blinding sun glare, and inclement weather. Thermal sensors improve the ability to see in darkness and through most fog and sun glare, and they reliably classify vehicles, people, animals, and other objects in these common driving conditions. Furthermore, thermal cameras perform equally well in daytime driving, offering redundancy for a visible camera.

Low-light visible cameras, coupled with LIDAR and radar, provide baseline nighttime performance, but at ranges beyond approximately 165 feet (50 meters), thermal cameras significantly outperform low-light visible cameras and deliver more consistent imagery across all lighting conditions.

The NTSB report³ on the Uber incident in Tempe, Arizona, in which a pedestrian was fatally struck by a developmental SAE level 3 autonomous car using LIDAR, radar, and visible sensors, revealed that the pedestrian was first classified as an unknown object, then as a car, and then as a bicycle before finally being classified as a person. FLIR re-created this accident using a wide field of view (FOV) FLIR ADK™ and a basic classifier. The thermal camera system classified the pedestrian at approximately 280 feet (85.4 meters), more than twice the required "fast-reaction" stopping distance of 126 feet (38.4 meters) for a human driving at 43 mph.⁴ Additional testing with narrower-FOV thermal cameras has demonstrated pedestrian classification at greater than 200 meters, roughly four times farther than typical headlights and visible cameras can see.
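The cited 126-foot stopping distance can be sanity-checked with the standard formula: stopping distance equals reaction distance plus braking distance. In the sketch below, the 0.75-second "fast" reaction time and roughly 0.8 g braking deceleration are our own assumptions, chosen only to show the arithmetic; they happen to land close to the figure quoted above:

```python
# Back-of-the-envelope stopping-distance check. The reaction time and
# deceleration are assumed values, not figures from this paper.

G_FT_S2 = 32.174  # standard gravity, ft/s^2


def stopping_distance_ft(speed_mph: float,
                         reaction_time_s: float = 0.75,  # assumed "fast" reaction
                         decel_g: float = 0.8) -> float:  # assumed hard braking
    """Total stopping distance in feet: v * t_reaction + v^2 / (2 * a)."""
    v = speed_mph * 5280.0 / 3600.0  # mph -> ft/s
    reaction = v * reaction_time_s   # distance covered before the brakes engage
    braking = v ** 2 / (2.0 * decel_g * G_FT_S2)
    return reaction + braking


print(f"{stopping_distance_ft(43):.0f} ft")  # ~125 ft, close to the cited 126 ft
```

Classifying the pedestrian at 280 feet therefore leaves better than a twofold margin over the distance needed to stop at that speed.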
Figure 3. Thermal sensors add reliability and improve the performance of the ADAS and AV sensor suite. (Diagram: vehicle applications, including object, animal, and pedestrian detection, adaptive cruise control, collision avoidance, emergency braking, situational awareness and location, lane departure warning, pedestrian and sign recognition, front and rear cross traffic alert, face and eye tracking, surround view, blind spot detection, and thermal 360° park assist, mapped to long-range radar, forward camera, LIDAR, short/medium-range radar, ultrasound, and thermal imaging.)

Table 1. Detector technologies and application summary: which of the five sensing technologies (visible, thermal, radar, LIDAR, ultrasound) serve each application, from traffic sign recognition, adaptive cruise control, and lane departure warning through emergency brake assist, pedestrian and animal detection and classification, night vision, blind spot detection, rear collision warning, park assist, mapping/location, front and rear cross traffic alert, rear AEB, collision avoidance, and surround view.

Figure 4. A wide-FOV FLIR ADK classified a person at 280 feet, twice the needed stopping distance, in the re-creation of the Uber accident in Tempe, Arizona.

³ NTSB, Preliminary Report HWY18MH010, https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf
⁴ http://www.brakingdistances.com

Figure 5. Fog-tunnel testing demonstrates significant visibility improvement with thermal versus visible cameras.

"SEEING" HEAT INSTEAD OF RELYING ON LIGHT

As recent events and our own driving experience demonstrate, it can be challenging to see pedestrians day or night. There are numerous cases where the combination of driving conditions and environment illustrates the need for thermal cameras. In fact, there are times when a thermal camera may be the only detection and classification technology that works.

Thermal sensing boosts the performance of ADAS and AV platforms in inclement weather, including dust, smog, fog, and light rain. LWIR thermal sensors are completely passive, a key advantage over visible cameras, LIDAR, and radar, for which target reflectivity and atmospheric effects can create variables in sensor performance, particularly at the limits of their operating envelopes:

• Visible-light cameras depend on light from the sun, street lights, or headlights being reflected off objects and received by the sensor.
• LIDAR sensors emit laser light and process the reflected energy by measuring the time of flight of the illumination source.
• Radar emits radio signals and processes the return signal.

Thermal imaging takes advantage of the fact that all objects emit thermal energy and therefore eliminates reliance on an illumination source. Figure 6 illustrates how passive sensing benefits thermal over visible sensors in light to moderate fog, where a thermal camera can see at least four times farther than a visible camera per the TACOM Thermal Image Model (TTIM):

LIGHT TRANSMISSION THROUGH FOG (TTIM)

  Fog Category    Visual    LWIR
  Category 1      1.22      5.9 - 10.1
  Category 2      0.62      2.4
  Category 3      0.305     0.293
  Category 4      0.092     0.087

A radar or LIDAR signal from a pedestrian can also be camouflaged by a nearby vehicle's competing signal, and if a pedestrian is crossing between two cars or is partially obstructed by foliage, there will be little to no reflected signal with which to detect the pedestrian. In such cases, as in search-and-rescue and military applications, thermal cameras can see through light foliage. Because thermal sensors see heat rather than visible shapes, they have an advantage over visible cameras in classifying partially occluded people and animals: the heat from a person or animal makes them stand out from a cluttered background or a partially obstructing foreground, as shown in Figure 6.

Figure 6. Passive thermal sensing can detect pedestrians from distances four times farther than a visible sensor through light to moderate fog, day or night, per the TACOM Thermal Image Model (TTIM). (Diagram, "Comparison of visible and IR in fog": A, visible light from the object is scattered and absorbed by fog; B, headlight light is reflected from the active illumination; C, infrared light suffers less absorption by fog.)

READY FOR ALL DRIVING CONDITIONS

The primary challenge for ADAS and AV platforms is being prepared for all driving conditions. The road is full of complex, unpredictable situations, and cars must be equipped with cost-effective sensor suites capable of collecting as much information as possible to make the right decision every time. The current standard sensor suite does not completely address the requirements for SAE level 3 and greater. Thermal cameras can see pedestrians up to four times farther away than a visible camera in darkness, through sun glare, and through most fog, and they provide an excellent orthogonal and redundant sensing modality.
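As a closing back-of-the-envelope check on that "four times farther" figure, the sketch below computes the LWIR-to-visual range ratio for each fog category in the TTIM table above (values as printed there; the code is our own illustration, not part of the TTIM):

```python
# Illustrative check: LWIR vs. visual range ratios from the TTIM
# fog-transmission table above (values as printed in this paper).

TTIM = {  # fog category: (visual, LWIR low, LWIR high)
    1: (1.22, 5.9, 10.1),
    2: (0.62, 2.4, 2.4),
    3: (0.305, 0.293, 0.293),
    4: (0.092, 0.087, 0.087),
}

for category, (visual, lwir_lo, lwir_hi) in TTIM.items():
    lo, hi = lwir_lo / visual, lwir_hi / visual
    span = f"{lo:.1f}x" if lo == hi else f"{lo:.1f}x to {hi:.1f}x"
    print(f"Category {category}: LWIR range is {span} the visual range")
```

The ratios work out to roughly 4x to 8x in light to moderate fog (Categories 1 and 2) and near parity in dense fog (Categories 3 and 4), consistent with the "at least four times farther in light to moderate fog" claim made above.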