Efficient Particle Filter-Based Tracking of Multiple Interacting Targets Using an MRF-Based Motion Model

Efficient Particle Filter-Based Tracking of Multiple Interacting Targets Using an MRF-Based Motion Model

Zia Khan, Tucker Balch, and Frank Dellaert
{zkhan,tucker,dellaert}@cc.gatech.edu
College of Computing, Georgia Institute of Technology, Atlanta, Georgia 30332, USA

Abstract— We describe a multiple hypothesis particle filter for tracking targets that will be influenced by the proximity and/or behavior of other targets. Our contribution is to show how a Markov random field motion prior, built on the fly at each time step, can model these interactions to enable more accurate tracking. We present results for a social insect tracking application, where we model the domain knowledge that two targets cannot occupy the same space, and targets will actively avoid collisions. We show that using this model improves track quality and efficiency. Unfortunately, the joint particle tracker we propose suffers from exponential complexity in the number of tracked targets. An approximation to the joint filter, however, consisting of multiple nearly independent particle filters, can provide similar track quality at substantially lower computational cost.

I. INTRODUCTION

This work is concerned with the problem of tracking multiple interacting targets. Our objective is to obtain a record of the trajectories of targets over time, and to maintain correct, unique identification of each target throughout. Tracking multiple identical targets becomes challenging when the targets pass close to one another or merge.

For non-interacting targets, the classical multi-target tracking literature approaches the problem by performing a data-association step after a detection step. Most notably, the multiple hypothesis tracker [1] and the joint probabilistic data association filter (JPDAF) [2], [3] are the most influential algorithms in this class. These multi-target tracking algorithms have been used extensively in the context of computer vision. Some examples are the use of nearest neighbor tracking in [4], the multiple hypothesis tracker in [5], and the JPDAF in [6]. Recently, a particle filtering version of the JPDAF has been proposed in [7].

These approaches are appropriate when targets behave independently, and the problem is one of visual confusion. However, in some applications, e.g. when tracking social agents, the targets do not behave independently. For example, in our own work on tracking social insects like ants and bees [8], the insects effectively live on a 2D plane and will rarely crawl on top of each other. In these cases, identity could be maintained during tracking by providing a more complex motion model that models the interaction between targets.

Our contribution is to show how a Markov random field (MRF) motion prior, built on the fly at each time step, can model these interactions. Our approach is based on the well known particle filter [9], [10], [11], [12], a multi-hypothesis tracker that approximates the filtered posterior distribution by a set of weighted particles. The standard particle filter weights particles based on a likelihood score, and then propagates these weighted particles according to a motion model. We show that incorporating an MRF to model interactions is equivalent to adding an additional interaction factor to the importance weights in a joint particle filter.

Unfortunately, the joint particle filter suffers from exponential complexity in the number of tracked targets, n. However, we have found that using n nearly independent particle filters, each running in a low-dimensional space, can approximate the optimal joint tracker by computing the interaction potentials based on single-target predictions from the previous time step. The result is a set of trackers that are coupled only when necessary. The performance advantage of the decoupled trackers over a joint filter is significant: computational requirements render the joint filter unusable for more than three or four targets, while our decoupled tracking system can easily track 20 or more targets.
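The interaction-weighting idea above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the exponential distance-based potential, the function names, and the separation scale `sigma` are all assumptions introduced for this sketch; the paper only states that an MRF interaction factor multiplies the importance weights of a joint particle filter.

```python
import math

def interaction_potential(pos_a, pos_b, sigma=20.0):
    """Hypothetical pairwise MRF potential: near zero when two predicted
    target positions overlap, near one when they are far apart. The
    exponential form is an illustrative assumption."""
    d2 = (pos_a[0] - pos_b[0]) ** 2 + (pos_a[1] - pos_b[1]) ** 2
    return 1.0 - math.exp(-d2 / (2.0 * sigma ** 2))

def joint_weight(likelihoods, positions, sigma=20.0):
    """Joint-particle importance weight: the product of per-target
    likelihood scores, multiplied by the pairwise interaction factor
    over all target pairs."""
    w = 1.0
    for lik in likelihoods:
        w *= lik
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            w *= interaction_potential(positions[i], positions[j], sigma)
    return w
```

With this shape of potential, a joint hypothesis that places two targets on top of each other receives a sharply reduced weight, which is exactly the "two targets cannot occupy the same space" domain knowledge described in the abstract.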
We have selected visual animal tracking as a domain to illustrate the approach. In particular, we track a number of ants in a small arena. This is not an artificial task: our long term research goals involve the analysis of multi-agent system behavior, with social insects as a model. The domain offers many challenges that are quite different from the typical radar tracking domain in which most multi-target tracking algorithms are evaluated.

Fig. 1. (a) Particle filtering: a set of particles (white rectangles) are scored according to how well the underlying pixels match the appearance model (left). Particles are resampled (middle) according to the normalized weights determined in the previous step. Finally, the estimated location of the target is computed as the mean of the resampled particles. (b) Motion model: the previous image and particles (left). A new image frame is loaded (center). Each particle is advanced according to a stochastic motion model (right). The samples are now ready to be scored and resampled as above.

II. THE PARTICLE FILTER

Particle filters use multiple discrete "particles" to represent the belief distribution over the location of a tracked target [9], [10], [11], [12]. This part of our system is not new, but we review the algorithm to introduce notation and to facilitate understanding of the new components.
In the Bayes filtering paradigm, we recursively update the posterior distribution over the current state X_t given all observations Z^t = {Z_1..Z_t} up to and including time t, as follows:

P(X_t | Z^t) = k P(Z_t | X_t) P(X_t | Z^{t-1})
             = k P(Z_t | X_t) \int_{X_{t-1}} P(X_t | X_{t-1}) P(X_{t-1} | Z^{t-1})

where the likelihood P(Z_t | X_t) expresses the measurement model and P(X_t | X_{t-1}) is the motion model. In a particle filter we approximate the posterior P(X_{t-1} | Z^{t-1}) recursively as a set of N weighted samples {X_{t-1}^{(r)}, \pi_{t-1}^{(r)}}, r = 1..N, where \pi_{t-1}^{(r)} is the weight for particle X_{t-1}^{(r)}. Given this, we can use a Monte Carlo approximation of the integral and get:

P(X_t | Z^t) ≈ k P(Z_t | X_t) \sum_r \pi_{t-1}^{(r)} P(X_t | X_{t-1}^{(r)})    (1)

One way to view a particle filter is as an importance sampler for this distribution. Specifically, N samples X_t^{(s)} are drawn from the proposal distribution

q(X_t) = \sum_r \pi_{t-1}^{(r)} P(X_t | X_{t-1}^{(r)})

and then weighted by the likelihood, i.e.

\pi_t^{(s)} = P(Z_t | X_t^{(s)})

This results in a weighted particle approximation {X_t^{(s)}, \pi_t^{(s)}}, s = 1..N, for the posterior P(X_t | Z^t) at time t. Note that there are other ways to explain the particle filter (see e.g. [13]) that also accommodate other variants, but the mixture proposal view above is particularly suited for our application.

The general operation of the particle filter tracker is illustrated in Figure 1 for tracking ants in video. In our case, each hypothesis (or particle) is represented by a rectangular region about the size of our ant targets. Each target is tracked by 5 particles in the illustrated example, but in actual runs we typically use 200 particles per target. We assume we start with particles distributed around the target to be tracked. The principal steps in the algorithm include:

1) Score: each particle is scored according to how well the underlying pixels match an appearance model, by weighting it with P(Z_t | X_t) (see next section).
2) Resample: the particles are "resampled" according to their score. This operation results in the same number of particles, but very likely particles are duplicated while unlikely ones are dropped. Note that this is equivalent to choosing a mixture component to sample from in (1).
3) Average: the location of the target is estimated by computing the mean of all the associated particles. This is the estimate reported by the algorithm as the location of the target in the current video frame.
4) Apply motion model: each particle is stochastically repositioned according to a model of the target's motion. This is implemented by sampling from the chosen mixture component in (1).
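The four steps can be sketched for a one-dimensional target as follows. This is a minimal sketch, not the paper's implementation: the one-dimensional state, the Gaussian random-walk motion model, and the placeholder `likelihood` function standing in for P(Z_t | X_t) are all illustrative assumptions.

```python
import random

def particle_filter_step(particles, likelihood, motion_noise=2.0):
    """One update of the mixture-proposal particle filter:
    score, resample, average, then apply the motion model."""
    # 1) Score: weight each particle by the measurement likelihood.
    scores = [likelihood(x) for x in particles]
    total = sum(scores) or 1.0
    weights = [s / total for s in scores]
    # 2) Resample: draw N particles in proportion to their normalized
    #    weights; likely particles duplicate, unlikely ones drop out.
    resampled = random.choices(particles, weights=weights, k=len(particles))
    # 3) Average: report the mean of the resampled particles as the
    #    estimated target location for this frame.
    estimate = sum(resampled) / len(resampled)
    # 4) Apply motion model: stochastically reposition each particle,
    #    i.e. sample from the chosen mixture component in (1).
    moved = [x + random.gauss(0.0, motion_noise) for x in resampled]
    return moved, estimate
```

Iterating this step with a likelihood peaked at the true target location concentrates the particle set around the target, which is the behavior depicted in Figure 1.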
III. APPEARANCE-BASED TRACKING

For completeness, we describe here the details of the appearance-based measurement model we used for tracking insects in video, given that all the results we show in Section V are based on this. We assume that the measurement Z_t consists of a local image neighborhood I, formed as the superposition of a sprite with shape s on a background (this is similar to [14], but without occlusion). In particular, for the ant-tracking experiments below we model the sprite as a rectangular image with three parameters: length, width, and position of rotational center (along the length).

In the following, we refer to the appearance model parameters as \theta_f. We also model the background of the image with parameters \theta_b. An estimate for the background model is obtained by averaging all the images over the entire video sequence. For notational convenience, let x = X_t in what follows. Assuming that pixel values in I are conditionally independent given x, we can factor P(I | x, \theta) into foreground and background components:

P(I | x, \theta) = \prod_{p \in B(s,x)} P(I_p | \theta_b) \prod_{p \in F(s,x)} P(I_p | x, \theta_f)

where p is an individual pixel, and

  • \prod_{p \in B(s,x)} P(I_p | \theta_b) is the likelihood factor due to the background pixels B(s,x). The set of background pixels B(s,x) depends on the sprite shape s and position x.

Fig. 2. Example Markov random field for a group of ants. Only targets that are close to one another have the potential for interacting in their motion.
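The factored likelihood can be sketched as follows, working in the log domain for numerical stability. This is an illustrative sketch only: the per-pixel univariate Gaussian models for \theta_f and \theta_b, and the representation of the image as a dict from pixel coordinates to intensities, are assumptions made for this example, not the paper's stated parameterization.

```python
import math

def log_gaussian(value, mean, std):
    """Log density of a univariate Gaussian."""
    return (-0.5 * math.log(2.0 * math.pi * std * std)
            - (value - mean) ** 2 / (2.0 * std * std))

def log_likelihood(image, fg_pixels, fg_model, bg_model):
    """Factored appearance log-likelihood log P(I | x, theta): pixels
    covered by the sprite, F(s, x), are scored under the foreground
    parameters theta_f; all other pixels under the background
    parameters theta_b. `image` maps pixel coordinates to intensities;
    `fg_pixels` is the set F(s, x) determined by the sprite shape s
    and the particle position x; each model is a (mean, std) pair."""
    logp = 0.0
    for p, intensity in image.items():
        if p in fg_pixels:
            logp += log_gaussian(intensity, *fg_model)   # foreground factor
        else:
            logp += log_gaussian(intensity, *bg_model)   # background factor
    return logp
```

Under this sketch, a particle whose rectangle actually covers a dark ant on a light background scores higher than a particle whose rectangle sits over background pixels, which is what drives the scoring step of the tracker.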
