Sensor Fusion and Tracking for Autonomous Systems
Marc Willerton, Senior Application Engineer, MathWorks
© 2015 The MathWorks, Inc.

Abstract
▪ There is exponential growth in the development of increasingly autonomous systems. These systems range from road vehicles that meet the various NHTSA levels of autonomy, through consumer quadcopters capable of autonomous flight and remote piloting, to package delivery drones, flying taxis, and robots for disaster relief and space exploration. Work on autonomous systems spans industries and includes academia as well as government agencies.
▪ In this talk, you will learn to design, simulate, and analyze systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. By fusing data from multiple sensors, you achieve a better result than would otherwise be possible by looking at the output of any individual sensor.
▪ Several autonomous system examples are explored to show you how to:
– Define trajectories and create multiplatform scenarios
– Simulate measurements from inertial and GPS sensors
– Generate object detections with radar, EO/IR, sonar, and RWR sensor models
– Design multi-object trackers as well as fusion and localization algorithms
– Evaluate system accuracy and performance on real and synthetic data
Capabilities of an Autonomous System
▪ Sense
▪ Perceive
▪ Decide & Plan – Learning Algorithms, Optimization
▪ Act – Control Algorithms
Agenda
▪ Introduction
▪ Technology overview of perception
▪ Sensor models for sensor fusion and tracking
▪ Building simulation scenarios
▪ Developing a Multi-Object Tracker
▪ Tracking from multiple platforms
▪ Connecting trackers to a control system
▪ Q&A
Sensor fusion and tracking is…
Self-awareness and situational awareness, built from accelerometer, magnetometer, radar, camera, IR, sonar, lidar, gyro, GPS, and more.

Signal and Image Processing → Sensor Fusion and Tracking → Control
Timeline of Technology Advances
▪ Multi-object tracking and localization have progressed from military (Air Traffic Control), to commercial (Computer Vision for Transportation), to ubiquitous today (Multi-sensor Fusion for Autonomous Systems).

Fusion Combines the Strengths of Each Sensor
[Plot of cross range vs. down range. Legend: vision measurement at time step k, radar measurement at time step k, predicted estimate at time step k, fused estimate (track) at time steps k-1 and k; ellipses represent uncertainty.]
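The benefit of fusion can be made concrete with a minimal sketch (plain Python here rather than toolbox code, with made-up numbers): fusing two independent, unbiased measurements by inverse-variance weighting always yields a variance lower than either sensor alone.

```python
# Inverse-variance fusion of two independent, unbiased measurements
# of the same quantity. Analogous to radar (accurate down-range)
# complementing vision (coarse down-range) along one axis.

def fuse(z1, var1, z2, var2):
    """Return the minimum-variance fused estimate and its variance."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Radar: range variance 1.0; vision: range variance 25.0 (hypothetical)
z, v = fuse(100.2, 1.0, 103.0, 25.0)
# The fused variance is below min(1.0, 25.0): the fused track is
# better than either sensor on its own.
```

The same weighting appears inside the Kalman filter update, where the gain balances predicted and measured uncertainty.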
What is Localization?
An inertial sensor provides the measurements from which attitude and position are estimated.
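A minimal illustration of the idea (plain Python, hypothetical rates and angles, not toolbox code): a complementary filter blends the smooth-but-drifting integrated gyro rate with the noisy-but-drift-free accelerometer tilt to estimate attitude.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Blend integrated gyro rate (smooth, but drifts) with an
    accelerometer tilt angle (noisy, but drift-free) to estimate a
    single attitude angle, in degrees."""
    angle = accel_angles[0]
    estimates = []
    for w, a in zip(gyro_rates, accel_angles):
        # Trust the gyro propagation mostly; leak in the absolute reference
        angle = alpha * (angle + w * dt) + (1 - alpha) * a
        estimates.append(angle)
    return estimates

# Hypothetical data: true angle fixed at 10 deg, gyro reads only a
# 0.5 deg/s bias, accelerometer reads the true angle (noise omitted).
est = complementary_filter([0.5] * 200, [10.0] * 200, dt=0.01)
# Pure integration of the biased gyro would drift without bound;
# the filter stays bounded near 10 deg.
```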
Sensor Models for Sensor Fusion and Tracking
Exploring the gyro model in Sensor Fusion and Tracking Toolbox
▪ Imposing ADC quantisation effects
▪ Imposing white noise
▪ Imposing brown noise
▪ Imposing pink noise
▪ Imposing temperature-scaled bias
For more information, see the example.
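As a rough sketch of what such a sensor model does (plain Python with made-up parameter values, not the toolbox's gyro model interface), a simulated reading can be built by stacking a bias, white noise, and ADC quantisation:

```python
import random

def gyro_model(true_rate, bias, noise_std, lsb, rng):
    """Apply a constant bias, additive white noise, and ADC
    quantisation (rounding to the nearest LSB) to a true rate."""
    noisy = true_rate + bias + rng.gauss(0.0, noise_std)
    return round(noisy / lsb) * lsb   # quantisation step

rng = random.Random(42)
readings = [gyro_model(1.0, bias=0.02, noise_std=0.05, lsb=0.01, rng=rng)
            for _ in range(10000)]
mean = sum(readings) / len(readings)
# The mean reading sits near true_rate + bias, not the true rate:
# white noise averages out, but the bias does not.
```

Brown/pink noise and temperature-scaled bias would be further stages on top of this, shaping the noise spectrum and making the bias temperature-dependent.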
Levels of Fidelity: Radar Detections vs. I/Q Samples Simulation
radar = monostaticRadarSensor('Rotator')
radar = monostaticRadarSensor('Sector')
radar = monostaticRadarSensor('Mechanical Raster')
radar = monostaticRadarSensor('Electronic Raster')

Generating Radar Detections in MATLAB
[Block diagram: target positions and simulation time in; detections (time, measurement, sensor ID, etc.) out]
For more information, see the example – we will see how to build this as a scenario later…

Fusing Sensor Data Improves Localization
[Plots: ground truth vs. estimate; sensor measurements; error]
Example here

Fuse IMU & GPS for Self-Localization of a UAV
Example here

Fuse IMU & Odometry for Self-Localization in GPS-Denied Areas
The VO (visual odometry) estimate is off by a scale factor
IMU dead reckoning drift
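The drift itself is easy to reproduce in a toy sketch (plain Python, hypothetical bias value): integrating a small constant rate bias produces a heading error that grows linearly with time, which is why an absolute reference such as GPS, odometry, or a magnetometer is needed to correct it.

```python
# Dead-reckoning drift: a small constant gyro bias, once integrated,
# produces a heading error that grows linearly with time.
dt = 0.01            # sample period, s (100 Hz)
bias = 0.01          # deg/s, hypothetical uncompensated gyro bias
heading_err = 0.0
errors = []
for k in range(60 * 100):        # 60 seconds at 100 Hz
    heading_err += bias * dt     # integrate the bias
    errors.append(heading_err)
# After 60 s the heading is off by ~0.6 deg, and the error keeps
# growing unless an absolute reference corrects it.
```

A doubly-integrated accelerometer bias behaves even worse: the resulting position error grows quadratically with time.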
Flexible Workflows Ease Adoption: Wholesale or Piecemeal
[Workflow diagram: Scenario Definition and Sensor Simulation (ownship trajectory generation, actors/platforms, INS sensor simulation, radar/IR/sonar sensor simulation) or Recorded Sensor Data → documented interface for detections → Algorithms (gnnTracker, INS filter, etc.) → documented interface for tracks → Visualization & Metrics]
Stream Data to MATLAB from IMUs Connected to Arduino
MEMS devices
▪ 9-axis (gyro + accelerometer + compass)
▪ 6-axis (gyro + accelerometer)
▪ Up to 200 Hz sampling rate
New Hardware and Multisensor Positioning Examples
Building a Simulation Scenario
Example here
Test Tracker Performance on Pre-Built Benchmark Trajectories
Reference: W. D. Blair, G. A. Watson, T. Kirubarajan, and Y. Bar-Shalom, "Benchmark for radar allocation and tracking in ECM," IEEE Transactions on Aerospace and Electronic Systems, vol. 34, no. 4, 1998.

To go further on localization, see also
A Multi-object Tracker is More than a Kalman Filter
Multitarget Tracker
Detections (from various sensors at various update rates) → Track Association and Management + Tracking Filter → Tracks

▪ Assigns detections to tracks
▪ Fuses measurements with the track state
▪ Creates new tracks
▪ Updates existing tracks
▪ Removes old tracks
Tracking Algorithm Development Workflow
[Workflow diagram: Scenario Definition and Sensor Simulation (ownship trajectory generation, actors/platforms, INS and radar/IR/sonar sensor simulation) or Recorded Sensor Data → objectDetection → Tracking Algorithms (GNN, JPDA, MHT, PHD, etc.) → tracks → Visualization & Metrics]
Components of a Multi-Object Tracker
The Inertial Navigation System provides the radar sensor platform's position, velocity, and orientation.

Track Association and Management
Tracking Filter
More information here

Components of a Multi-Object Tracker: Tracking Filter

The covariance estimate assumes the system is linear: if y1(t) = H{x1} and y2(t) = H{x2}, then H{A·x1 + B·x2} = A·y1(t) + B·y2(t). The state evolves under process noise (e.g. model error) and the output is corrupted by measurement (sensor) noise. Common assumption: Gaussian noise ⇒ Kalman filter.

▪ Linear Kalman Filter – assumes a linear system
▪ Extended Kalman Filter – linearizes the nonlinear system around the state estimate
▪ Unscented Kalman Filter – samples the covariance distribution and propagates the samples through the nonlinear model
▪ Gaussian Sum Filter – good for partially observable cases (e.g. range-only measurements)
▪ Interacting Multiple Model Filter – good for tracking manoeuvring targets
▪ Particle Filter – doesn't require Gaussian noise

State initialisation and motion models (e.g. constant velocity/acceleration/turn) are available, or use your own.

Linear Kalman Filter example:
1. Create the measured positions from a constant-velocity trajectory
2. Specify initial position and velocity
3. Run the Kalman filter
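Those three steps can be sketched end to end in plain Python (a 2-state position/velocity filter with assumed noise values; the toolbox provides ready-made filters in MATLAB):

```python
import random

# 1. Measured positions from a constant-velocity trajectory
rng = random.Random(0)
dt, vel = 0.1, 2.0
truth = [vel * dt * k for k in range(100)]
meas = [p + rng.gauss(0.0, 0.5) for p in truth]

# 2. Initial state [position, velocity] and covariance
x = [meas[0], 0.0]
P = [[1.0, 0.0], [0.0, 1.0]]
q, r = 1e-4, 0.25   # assumed process / measurement noise variances

# 3. Run the Kalman filter
for z in meas[1:]:
    # Predict with the constant-velocity model, F = [[1, dt], [0, 1]]
    x = [x[0] + dt * x[1], x[1]]
    P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
          P[0][1] + dt * P[1][1]],
         [P[1][0] + dt * P[1][1],
          P[1][1] + q]]
    # Update with the position measurement, H = [1, 0]
    S = P[0][0] + r                     # innovation covariance
    K = [P[0][0] / S, P[1][0] / S]      # Kalman gain
    y = z - x[0]                        # innovation
    x = [x[0] + K[0] * y, x[1] + K[1] * y]
    P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
         [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
# The velocity estimate converges near the true value of 2.0 even
# though only positions are measured.
```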
More information here

Components of a Multi-Object Tracker: Track Association and Management

One or more sensors generate multiple detections from multiple targets – detections must be:
1. Gated – determine which detections are valid candidates to update existing tracks
2. Assigned* – make a track-to-detection assignment. Assignment approaches, from lowest complexity to best performance:
> Global Nearest Neighbour – minimise the overall distance of track-to-detection assignments
> Joint Probabilistic Data Association – soft assignment, so all gated detections make weighted contributions to a track
> Track-Oriented Multiple Hypothesis Tracking – allows data association to be postponed until more information is received

Track maintenance is required for the creation (tentative status), confirmation, and deletion (after coasting) of tracks
> Can use history- or score-based logic
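Global Nearest Neighbour in miniature (plain Python sketch with made-up positions; production trackers solve the assignment with algorithms such as Munkres rather than brute force): gate each track/detection pair, then pick the assignment with the smallest summed distance.

```python
from itertools import permutations

def gnn_assign(tracks, detections, gate=5.0):
    """Global Nearest Neighbour in 1-D: minimise the summed
    track-to-detection distance over all gated assignments."""
    n = len(tracks)
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(detections)), n):
        cost, ok = 0.0, True
        for t, d in zip(range(n), perm):
            dist = abs(tracks[t] - detections[d])
            if dist > gate:          # gating: reject implausible pairs
                ok = False
                break
            cost += dist
        if ok and cost < best_cost:
            best, best_cost = list(perm), cost
    return best, best_cost

# Two predicted track positions, three detections (one is a false alarm)
assignment, cost = gnn_assign([10.0, 20.0], [19.2, 10.5, 42.0])
# -> track 0 takes detection 1, track 1 takes detection 0;
#    detection 2 is left unassigned (it would seed a tentative track).
```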
Advanced Topic – Track to Track Fusion:
* Some trackers (e.g. the PHD filter) don't require assignment

More information here

Components of a Multi-Object Tracker: Track Association and Management
[Plot over time: detection data, track creation, track deletion]
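History-based logic can be sketched as an M-of-N rule (plain Python; the thresholds below are illustrative, not toolbox defaults): confirm a tentative track after enough hits in a sliding window, delete it after too many consecutive misses.

```python
from collections import deque

class HistoryLogic:
    """Confirm a tentative track after m hits in the last n scans;
    delete it after max_misses consecutive misses (coasting)."""
    def __init__(self, m=2, n=3, max_misses=5):
        self.hits = deque(maxlen=n)
        self.m = m
        self.max_misses = max_misses
        self.misses = 0
        self.status = "tentative"

    def update(self, detected):
        self.hits.append(1 if detected else 0)
        self.misses = 0 if detected else self.misses + 1
        if self.status == "tentative" and sum(self.hits) >= self.m:
            self.status = "confirmed"
        if self.misses >= self.max_misses:
            self.status = "deleted"
        return self.status

logic = HistoryLogic()
history = [True, True, False, False, False, False, False]
states = [logic.update(d) for d in history]
# Confirmed after the second hit; coasts through misses, then deleted.
```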
Example of Multi-Object Tracking: Multifunction Radar – Search and Track

Search
Track (confirm)
Track (update)
Example here
Target 1 Detected
Detection Confirmed and Track 1 Created
Track 1 Updated
Example of Multi-Object Tracking: Performing What-If Analysis
Did the trajectories cross?
Two targets seen as one by the radar
1° azimuth resolution ⇒ ±270 m cross-range at 30 km
Example here
tracker = trackerGNN( ...
    'FilterInitializationFcn', @initCVFilter, ...
    'MaxNumTracks', numTracks, ...
    'MaxNumSensors', 1, ...
    'AssignmentThreshold', gate, ...
    'TrackLogic', 'Score', ...
    'DetectionProbability', pd, ...
    'FalseAlarmRate', far, ...
    'Volume', vol, 'Beta', beta);
±175 m at 10 km
±9 m at 1 km

tracker = trackerGNN( ...
    'FilterInitializationFcn', @initIMMFilter, ...
    'MaxNumTracks', numTracks, ...
    'MaxNumSensors', 1, ...
    'AssignmentThreshold', gate, ...
    'TrackLogic', 'Score', ...
    'DetectionProbability', pd, ...
    'FalseAlarmRate', far, ...
    'Volume', vol, 'Beta', beta);
tracker = trackerTOMHT( ...
    'FilterInitializationFcn', @initIMMFilter, ...
    'MaxNumTracks', numTracks, ...
    'MaxNumSensors', 1, ...
    'AssignmentThreshold', [0.2,1,1]*gate, ...
    'TrackLogic', 'Score', ...
    'DetectionProbability', pd, ...
    'FalseAlarmRate', far, ...
    'Volume', vol, 'Beta', beta, ...
    'MaxNumHistoryScans', 10, ...
    'MaxNumTrackBranches', 5, ...
    'NScanPruning', 'Hypothesis', ...
    'OutputRepresentation', 'Tracks');
Example of Multi-Object Tracking: Comparing Trackers and Tracking Filters
[Comparison plots annotated with false tracks and dropped tracks; trackers and filters ordered from slower to faster]
Simulink Support for Multi-Object Tracking
Point Object vs. Extended Object
▪ Point object
– Distant object represented as a single point
– One detection per object per scan
▪ Extended object
– High-resolution sensors generate multiple detections per object per scan
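One common way to feed extended-object detections to a point-object tracker is to cluster them first and report one centroid per cluster. A greedy single-link sketch (plain Python, made-up detections):

```python
def cluster_centroids(points, max_dist=2.0):
    """Greedy single-link clustering of 2-D detections; returns one
    centroid per cluster, so a point-object tracker sees a single
    detection per object. A sketch: real pipelines use more robust
    clustering (e.g. DBSCAN) on the point cloud."""
    clusters = []
    for p in points:
        for c in clusters:
            if any((p[0] - q[0])**2 + (p[1] - q[1])**2 <= max_dist**2
                   for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return [(sum(q[0] for q in c) / len(c), sum(q[1] for q in c) / len(c))
            for c in clusters]

# Six lidar-like returns from two vehicles about 10 m apart
dets = [(0.0, 0.0), (0.5, 0.2), (0.2, 0.6),
        (10.0, 0.0), (10.4, 0.3), (9.8, 0.5)]
cents = cluster_centroids(dets)
# -> two centroids, one per vehicle
```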
Extended Object Tracking Example
Tracking with Lidar (even more points!) – JPDA Tracker with IMM
Pre-process point cloud data to extract objects of interest. Example here.

To go further on tracking with a single sensor, see also
Multi-platform Scenario Generation
▪ Moving Airborne Radar (red)
– 360° mechanical scan in az
– No electronic scan
▪ Moving Airborne Radar (blue)
– 2 ULAs mounted above fuselage
– Electronically scanning a 120° az sector on both sides of the airframe
▪ Stationary Ground-Based Radar (yellow)
– Electronically scanned URA
– Raster scan surveying ±60° az and −20° to 0° el

Example here

Multi-platform Scenario Generation: Visualize Detections and Measurement Uncertainties
Ellipsoids represent uncertainties
Airborne ULA can’t measure Elevation
Mechanically scanning radar detects the target only twice
Example here

Multi-platform Scenario Generation: Visualize Track Accuracy
Good track altitude estimation despite poor measurements
Motion model mismatch (CV vs. CT)
Multi-platform Scenario Generation: Tune and Compare Trackers with Assignment Metrics
7 objects (P1, P2, …)
Track T09 for P1
Time to confirm tracks

Multi-platform Scenario Generation: Assess Tracker Performance with Assignment Metrics
Dropped track
False track
Multi-platform Scenario Generation: Jamming Scenario
Jammer prevents detection
Example here

Multi-platform Scenario Generation: Positioning with Direction-Only Measurements
Where are the real targets? How to remove the ghosts?
Example here
Simulate a lane detection and lane following system
[System diagram: the Simulation Environment (scenarios, scenery, actions, actors & events, goals & metrics, sensors) feeds a monocular camera lane detector (Perceive: locate self, track obstacles) and a lane-following controller (Decide & Plan, Act)]
Monocular camera lane detector
– Based on the shipping example "Visual Perception Using Monocular Camera" (Automated Driving Toolbox™)
– Lane rejection and tracking added to improve performance
Challenge with noisy lanes
The approaching car gets confused with the road
Integrate noisy lane rejection and tracking

Before correction
[Diagram: the helperMonoSensorWrapper System object wraps the helperMonoSensor System object and outputs vehicles & lanes]
▪ "EnableLaneTracker" – property of helperMonoSensorWrapper that enables noisy lane rejection
▪ rejectInvalidLanes – method of helperMonoSensorWrapper
With correction
Example: Before Correction Example: After Correction
Tracker Setup and Configuration
Kalman Tracker
• Two instances of the Kalman tracker track the left and right lanes independently
• Initialize trackers for the first valid lanes using configureKalmanFilter
• Input parameters for tracking: A, B, C coefficients of the parabolic lane boundaries
• Motion model: constant acceleration
• Correct the tracker for every valid lane
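The smoothing idea can be sketched per coefficient (plain Python; a scalar random-walk filter here, deliberately simpler than the constant-acceleration model described above, with made-up noise values):

```python
import random

def scalar_kf(measurements, r, q):
    """Scalar random-walk Kalman filter: smooths one noisy lane
    coefficient over frames. r = measurement noise variance,
    q = process noise variance."""
    x, p = measurements[0], 1.0
    out = []
    for z in measurements:
        p += q                       # predict (random-walk model)
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # update toward the measurement
        p *= (1 - k)
        out.append(x)
    return out

# Noisy per-frame estimates of the lane boundary's C coefficient
# (lateral offset, metres) in y = A*x^2 + B*x + C; A and B would be
# smoothed the same way, independently.
rng = random.Random(1)
true_C = 1.8
meas_C = [true_C + rng.gauss(0.0, 0.3) for _ in range(200)]
smooth_C = scalar_kf(meas_C, r=0.09, q=1e-4)
# smooth_C stays much closer to true_C than the raw measurements.
```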
Steering angle deviation for the simulation run
Without noisy lane rejector With noisy lane rejector
Sensor Fusion and Tracking…
Sense → Perceive → Decide & Plan → Act

▪ Is ubiquitous
▪ Leverages sensor strengths
▪ Enables autonomy

Signal and Image Processing → Sensor Fusion and Tracking → Control
Q&A
Marc Willerton ([email protected])