Sensor Fusion and Tracking for Autonomous Systems

Marc Willerton, Senior Application Engineer, MathWorks
© 2015 The MathWorks, Inc.

Abstract
▪ There is exponential growth in the development of increasingly autonomous systems. These systems range from road vehicles that meet the various NHTSA levels of autonomy, through consumer quadcopters capable of autonomous flight and remote piloting, package delivery drones, flying taxis, and robots for disaster relief and space exploration. Work on autonomous systems spans industries and includes academia as well as government agencies.
▪ In this talk, you will learn to design, simulate, and analyze systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. By fusing data from multiple sensors, you obtain a better result than is possible from the output of any individual sensor.
▪ Several autonomous system examples are explored to show you how to:
  – Define trajectories and create multiplatform scenarios
  – Simulate measurements from inertial and GPS sensors
  – Generate object detections with radar, EO/IR, sonar, and RWR sensor models
  – Design multi-object trackers as well as fusion and localization algorithms
  – Evaluate system accuracy and performance on real and synthetic data

Capabilities of an Autonomous System
▪ Sense
▪ Perceive
▪ Decide & Plan (learning algorithms, optimization)
▪ Act (control algorithms)

Agenda
▪ Introduction
▪ Technology overview of perception
▪ Sensor models for sensor fusion and tracking
▪ Building simulation scenarios
▪ Developing a multi-object tracker
▪ Tracking from multiple platforms
▪ Connecting trackers to a control system
▪ Q&A

Technology Overview of Perception

Sensor fusion and tracking is…
▪ Self-awareness (from sensors such as the accelerometer, gyro, magnetometer, and GPS) and situational awareness (from sensors such as radar, camera, IR, sonar, and lidar).
▪ In the processing chain, sensor fusion and tracking sits between signal and image processing and control.

Timeline of Technology Advances
▪ Multi-object tracking and localization began in military applications, moved into commercial use (air traffic control, computer vision for transportation), and are ubiquitous today in multi-sensor fusion for autonomous systems.

Fusion Combines the Strengths of Each Sensor
[Figure: down range vs. cross range plot showing the vision measurement, the radar measurement, the predicted estimate, and the fused track estimate at time step k, together with the fused estimate at time step k-1; each ellipse represents uncertainty.]

What is Localization?
▪ An inertial sensor provides estimates of the platform's attitude (orientation) and position.
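To make the self-localization idea concrete, here is a minimal sketch (not from the talk) that simulates IMU readings for a stationary platform with imuSensor and fuses them with imufilter to estimate attitude. The sample rate, duration, and stationary-platform assumption are illustrative choices; check the Sensor Fusion and Tracking Toolbox documentation for exact signatures.

```matlab
% Sketch: estimate attitude (orientation) by fusing simulated gyro and
% accelerometer readings. All numeric values are illustrative assumptions.
Fs = 100;                          % sample rate in Hz (assumed)
N  = 5*Fs;                         % five seconds of data

% Simulate an IMU on a stationary platform (zero acceleration, zero angular rate)
imu = imuSensor('accel-gyro', 'SampleRate', Fs);
[accelReadings, gyroReadings] = imu(zeros(N,3), zeros(N,3));

% Fuse the readings into an orientation (attitude) estimate
fuse = imufilter('SampleRate', Fs);
orientation = fuse(accelReadings, gyroReadings);

% Inspect the estimate as ZYX Euler angles in degrees
eulerEstimate = eulerd(orientation, 'ZYX', 'frame');
```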
Sensor Models for Sensor Fusion and Tracking

Exploring the Gyro Model in Sensor Fusion and Tracking Toolbox
The gyroscope model lets you impose realistic error sources on ideal angular-rate data:
▪ ADC quantisation effects
▪ White noise
▪ Brown noise
▪ Pink noise
▪ Temperature-scaled bias
For more information, see the example.

Levels of Fidelity: Radar Detections vs. I/Q Samples Simulation
The detection-level radar model supports several scan configurations:
radar = monostaticRadarSensor('Rotator')
radar = monostaticRadarSensor('Sector')
radar = monostaticRadarSensor('Mechanical Raster')
radar = monostaticRadarSensor('Electronic Raster')

Generating Radar Detections in MATLAB
The sensor model takes target positions and the simulation time and returns detections (time, measurement, sensor ID, and so on). For more information, see the example; we will see how to build this as a scenario later.

Fusing Sensor Data Improves Localization
[Figure: sensor measurements feed a fusion algorithm; plots compare the ground truth against the estimate and show the estimation error. See the example.]

Fuse IMU & GPS for Self-Localization of a UAV
▪ Sense → Perceive (locate self, track obstacles) → Decide & Plan → Act. See the example.

Fuse IMU & Odometry for Self-Localization in GPS-Denied Areas
▪ The visual odometry estimate is off by a scale factor; IMU dead reckoning drifts.
▪ Sense → Perceive (locate self, track obstacles) → Decide & Plan → Act.

Flexible Workflows Ease Adoption: Wholesale or Piecemeal
[Figure: scenario definition and sensor simulation (ownship trajectory generation, actors/platforms, INS sensor simulation, radar/IR/sonar sensor simulation) or recorded sensor data pass through a documented interface for detections into the fusion and tracking algorithms (filters, trackers such as gnnTracker), which pass through a documented interface for tracks into visualization and metrics.]

Stream Data to MATLAB from IMUs Connected to Arduino
MEMS devices:
▪ 9-axis (gyro + accelerometer + compass)
▪ 6-axis (gyro + accelerometer)
▪ Up to 200 Hz sampling rate

New Hardware and Multisensor Positioning Examples

Building a Simulation Scenario
See the example; a minimal scenario sketch follows below.

Test Tracker Performance on Pre-Built Benchmark Trajectories
Reference: W. D. Blair, G. A. Watson, T. Kirubarajan, and Y. Bar-Shalom, "Benchmark for Radar Allocation and Tracking in ECM," IEEE Transactions on Aerospace and Electronic Systems, vol. 34, no. 4, 1998.
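The sketch below illustrates scenario definition and radar detection generation; it is not taken from the talk, and the waypoints, rates, and sensor index are illustrative assumptions (check the toolbox documentation for exact signatures).

```matlab
% Sketch: one moving target and a rotating monostatic radar co-located with
% a stationary tower platform. All numeric values are illustrative.
scenario = trackingScenario('UpdateRate', 1);            % 1 Hz scenario clock

% Target platform on a straight-line waypoint trajectory (assumed waypoints)
target = platform(scenario);
target.Trajectory = waypointTrajectory([0 5e3 0; 6e3 5e3 0], [0 60]);

% Stationary tower platform; the radar (sensor index 1) scans in azimuth
tower = platform(scenario);
radar = monostaticRadarSensor(1, 'Rotator');

% Advance the scenario and generate detections from the tower's viewpoint
allDetections = {};
while advance(scenario)
    tgts = targetPoses(tower);               % target poses relative to the tower
    dets = radar(tgts, scenario.SimulationTime);
    allDetections = [allDetections; dets];   %#ok<AGROW>
end
```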
To go further on localization, see also the related examples.

Developing a Multi-Object Tracker

A Multi-Object Tracker is More than a Kalman Filter
A multitarget tracker takes detections from various sensors at various update rates and produces tracks. It combines:
▪ Track association and management – assigns detections to tracks, creates new tracks, updates existing tracks, and removes old tracks.
▪ Tracking filter – fuses measurements with the track state.

Tracking Algorithm Development Workflow
[Figure: scenario definition and sensor simulation (ownship trajectory generation, actors/platforms, INS and radar/IR/sonar sensor simulation) or recorded sensor data produce objectDetection objects; tracking algorithms (GNN, JPDA, MHT, PHD) produce tracks; both feed visualization and metrics.]

Components of a Multi-Object Tracker
The inertial navigation system provides the radar sensor platform's position, velocity, and orientation. The tracker itself consists of track association and management plus a tracking filter.

Components of a Multi-Object Tracker: Tracking Filter
The state-space model comprises the state, process noise (e.g. model error) at each time step, the system output, and measurement (sensor) noise. The Kalman filter covariance estimate assumes the system is linear: if y1(t) = H{x1} and y2(t) = H{x2}, then A·y1(t) + B·y2(t) = H{A·x1 + B·x2}. The common assumption of Gaussian noise leads to the Kalman filter family:
▪ Linear Kalman Filter – assumes a linear system
▪ Extended Kalman Filter – linearizes the nonlinear system around the state estimate
▪ Unscented Kalman Filter – samples the covariance distribution and propagates the samples through the nonlinear model
▪ Gaussian Sum Filter – good for partially observable cases (e.g. range-only measurements)
▪ Interacting Multiple Model Filter – good for tracking manoeuvring targets
▪ Particle Filter – does not require Gaussian noise
State initialisation functions and motion models (e.g. constant velocity, constant acceleration, constant turn) are available, or you can use your own.
Linear Kalman Filter example (a minimal sketch follows below):
1. Create the measured positions from a constant-velocity trajectory
2. Specify the initial position and velocity
3. Run the Kalman filter

Components of a Multi-Object Tracker: Track Association and Management
One or more sensors generate multiple detections from multiple targets. Detections must be:
1. Gated – determine which detections are valid candidates to update existing tracks.
2. Assigned* – make a track-to-detection assignment. Assignment approaches, from lowest complexity to best performance:
   > Global Nearest Neighbour – minimise the overall distance of track-to-detection assignments
   > Joint Probabilistic Data Association – soft assignment, so all gated detections make weighted contributions to a track
   > Track-Oriented Multiple Hypothesis Tracking – allows data association to be postponed until more information is received
Track maintenance is required for the creation (tentative status), confirmation, and deletion (after coasting) of tracks, and can use history-based or score-based logic.
Advanced topic – track-to-track fusion.
* Some trackers (e.g. the PHD filter) don't require assignment.
[Figure: detections arriving over time, illustrating track creation and, later, track deletion.]
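The linear Kalman filter example outlined above can be sketched as follows; the trajectory, noise levels, and filter settings are illustrative assumptions rather than values from the talk.

```matlab
% Sketch: track a 1-D constant-velocity target from noisy position
% measurements with a linear Kalman filter. Values are illustrative.

% 1. Create measured positions from a constant-velocity trajectory
dt = 0.1; t = (0:dt:10)';
truePos = 10 + 2*t;                          % start at 10 m, move at 2 m/s
measPos = truePos + 0.5*randn(size(t));      % noisy position measurements

% 2. Specify the initial position and velocity estimate
kf = trackingKF('MotionModel', '1D Constant Velocity', ...
    'State', [10; 0], ...                    % [position; velocity]
    'MeasurementNoise', 0.25);

% 3. Run the Kalman filter: predict, then correct with each measurement
estPos = zeros(size(t));
for k = 1:numel(t)
    predict(kf, dt);
    xEst = correct(kf, measPos(k));
    estPos(k) = xEst(1);
end
```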
Example of Multi-Object Tracking: Multifunction Radar Search and Track
The multifunction radar interleaves search, track confirmation, and track update tasks. The example steps through the sequence:
▪ Target 1 detected
▪ Detection confirmed and Track 1 created
▪ Track 1 updated
See the example.
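To show how the detect → confirm → update sequence above maps onto code, here is a minimal sketch that feeds position detections of a single moving target into a GNN tracker; the detection values, measurement noise, and thresholds are illustrative assumptions, not settings from the example.

```matlab
% Sketch: single target, one detection per update, GNN tracker with a
% constant-velocity Kalman filter. All numeric values are illustrative.
tracker = trackerGNN('FilterInitializationFcn', @initcvkf, ...
    'ConfirmationThreshold', [2 3], ...   % confirm after 2 hits out of 3 updates
    'DeletionThreshold', [5 5]);          % delete after 5 consecutive misses

dt = 1;                                   % tracker update interval in seconds
for k = 0:5
    time = k*dt;
    measuredPos = [100 + 10*time; -20; 0];            % target moving along x
    det = objectDetection(time, measuredPos, 'MeasurementNoise', eye(3));
    [confirmed, tentative] = tracker({det}, time);
    fprintf('t = %d s: %d tentative, %d confirmed track(s)\n', ...
        time, numel(tentative), numel(confirmed));
end
```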
