Results 1-10 of 38
MCMC-based particle filtering for tracking a variable number of interacting targets
 IEEE Transactions on Pattern Analysis and Machine Intelligence
, 2005
Abstract

Cited by 204 (6 self)
We describe a particle filter that effectively deals with interacting targets, i.e., targets that are influenced by the proximity and/or behavior of other targets. The particle filter includes a Markov random field (MRF) motion prior that helps maintain the identity of targets throughout an interaction, significantly reducing tracker failures. We show that this MRF prior can be easily implemented by including an additional interaction factor in the importance weights of the particle filter. However, the computational requirements of the resulting multi-target filter render it unusable for large numbers of targets. Consequently, we replace the traditional importance sampling step in the particle filter with a novel Markov chain Monte Carlo (MCMC) sampling step to obtain a more efficient MCMC-based multi-target filter. We also show how to extend this MCMC-based filter to address a variable number of interacting targets. Finally, we present both qualitative and quantitative experimental results, demonstrating that the resulting particle filters deal efficiently and effectively with complicated target interactions.
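The abstract's central mechanism, folding an MRF pair potential into the particle weights as an extra interaction factor, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the exponential repulsion potential and its scale are assumptions.

```python
import numpy as np

def pairwise_penalty(xi, xj, scale=10.0):
    """Hypothetical MRF pair potential: penalize targets that get close.
    Returns a factor in (0, 1]; approaches 1 when targets are far apart."""
    d = np.linalg.norm(xi - xj)
    return 1.0 - np.exp(-d / scale)

def interaction_factor(joint_state):
    """Product of pair potentials over all target pairs in one particle."""
    n = len(joint_state)
    f = 1.0
    for i in range(n):
        for j in range(i + 1, n):
            f *= pairwise_penalty(joint_state[i], joint_state[j])
    return f

def reweight(particles, likelihoods):
    """Importance weight = observation likelihood * MRF interaction factor,
    normalized over all particles of the joint state."""
    w = np.array([lik * interaction_factor(p)
                  for p, lik in zip(particles, likelihoods)])
    return w / w.sum()
```

A particle whose targets overlap is thus down-weighted relative to one whose targets stay apart, which is what preserves target identities through an interaction.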
An MCMC-based Particle Filter for Tracking Multiple Interacting Targets
 in Proc. ECCV
, 2003
Abstract

Cited by 153 (6 self)
We describe a Markov chain Monte Carlo based particle filter that effectively deals with interacting targets, i.e., targets that are influenced by the proximity and/or behavior of other targets. Such interactions cause problems for traditional approaches to the data association problem. In response, we developed a joint tracker that includes a more sophisticated motion model to maintain the identity of targets throughout an interaction, drastically reducing tracker failures. The paper presents two main contributions: (1) we show how a Markov random field (MRF) motion prior, built on the fly at each time step, can substantially improve tracking when targets interact, and (2) we show how this can be done efficiently using Markov chain Monte Carlo (MCMC) sampling. We prove that incorporating an MRF to model interactions is equivalent to adding an additional interaction factor to the importance weights in a joint particle filter. Since a joint particle filter suffers from exponential complexity in the number of tracked targets, we replace the traditional importance sampling step in the particle filter with an MCMC sampling step. The resulting filter deals efficiently and effectively with complicated interactions when targets approach each other. We present both qualitative and quantitative results to substantiate the claims made in the paper, including a large-scale experiment on a video sequence over 10,000 frames in length.
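The MCMC replacement for joint importance sampling can be sketched as a single Metropolis move that perturbs one target within the joint state, so cost per move does not grow exponentially with the number of targets. The Gaussian random-walk proposal and the generic `log_target` below are assumptions standing in for the paper's likelihood, motion prior, and MRF factor.

```python
import numpy as np

rng = np.random.default_rng(0)

def mcmc_step(state, log_target, step=1.0):
    """One Metropolis move on a joint multi-target state (n_targets x dim):
    perturb a single randomly chosen target, then accept or reject against
    the unnormalized joint posterior `log_target`."""
    n = len(state)
    i = rng.integers(n)                       # pick one target to move
    proposal = state.copy()
    proposal[i] = state[i] + rng.normal(scale=step, size=state[i].shape)
    # Symmetric proposal, so the Metropolis ratio is just the target ratio.
    if np.log(rng.random()) < log_target(proposal) - log_target(state):
        return proposal
    return state
```

Running many such moves per frame yields samples from the joint posterior without enumerating joint proposals over all targets at once.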
On Contrastive Divergence Learning
Abstract

Cited by 128 (15 self)
Maximum-likelihood (ML) learning of Markov random fields is challenging because it requires estimates of averages that have an exponential number of terms. Markov chain Monte Carlo methods typically take a long time to converge on unbiased estimates, but Hinton (2002) showed that if the Markov chain is only run for a few steps, the learning can still work well and it approximately minimizes a different function called "contrastive divergence" (CD). CD learning has been successfully applied to various types of random fields. Here, we study the properties of CD learning and show that it provides biased estimates in general, but that the bias is typically very small. Fast CD learning can therefore be used to get close to an ML solution, and slow ML learning can then be used to fine-tune the CD solution.
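A CD-1 update for a small binary restricted Boltzmann machine, a common instance of the random fields discussed above, can be sketched as below. The learning rate, the omitted bias terms, and the batch layout are illustrative choices, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, v0, lr=0.1):
    """One CD-1 step for a binary RBM with weight matrix W (n_vis x n_hid).
    v0: batch of binary visible vectors, shape (batch, n_vis)."""
    # Positive phase: hidden activations driven by the data.
    ph0 = sigmoid(v0 @ W)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # One Gibbs step: reconstruct visibles, then hidden probabilities.
    pv1 = sigmoid(h0 @ W.T)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W)
    # CD-1 gradient: data correlations minus reconstruction correlations,
    # i.e., the few-step stand-in for the intractable ML gradient.
    grad = (v0.T @ ph0 - v1.T @ ph1) / v0.shape[0]
    return W + lr * grad
```

Truncating the chain to one step is exactly what introduces the (typically small) bias the abstract analyzes.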
Tracking many objects with many sensors
 In IJCAI99
, 1999
Abstract

Cited by 98 (6 self)
Keeping track of multiple objects over time is a problem that arises in many real-world domains. The problem is often complicated by noisy sensors and unpredictable dynamics. Previous work by Huang and Russell, drawing on the data association literature, provided a probabilistic analysis and a threshold-based approximation algorithm for the case of multiple objects detected by two spatially separated sensors. This paper analyses the case in which large numbers of sensors are involved. We show that the approach taken by Huang and Russell, who used pairwise sensor-based appearance probabilities as the elementary probabilistic model, does not scale. When more than two observations are made, the objects' intrinsic properties must be estimated. These provide the necessary conditional independencies to allow a spatial decomposition of the global probability model. We also replace Huang and Russell's threshold algorithm for object identification with a polynomial-time approximation scheme based on Markov chain Monte Carlo simulation. Using sensor data from a freeway traffic simulation, we show that this allows accurate estimation of long-range origin/destination information even when the individual links in the sensor chain are highly unreliable.
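The MCMC approximation scheme for object identification can be illustrated with a Metropolis sampler over assignment permutations (which observation belongs to which object). The swap proposal and the generic `log_score` are hypothetical stand-ins for the paper's probability model, not its actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def mcmc_association(log_score, n, iters=2000):
    """Sample observation-to-object assignments (a permutation of 0..n-1)
    by Metropolis swap moves; log_score(perm) is an assumed joint
    log-probability of assigning observation i to object perm[i].
    Returns the best assignment visited by the chain."""
    perm = rng.permutation(n)
    cur_score = log_score(perm)
    best, best_score = perm.copy(), cur_score
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)
        cand = perm.copy()
        cand[i], cand[j] = cand[j], cand[i]   # symmetric swap proposal
        s = log_score(cand)
        if np.log(rng.random()) < s - cur_score:
            perm, cur_score = cand, s
            if s > best_score:
                best, best_score = cand.copy(), s
    return best
```

Because each move touches only two entries, the chain explores the n! assignment space in polynomial time per step, which is the spirit of the polynomial-time approximation scheme mentioned above.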
Bayesian inference in the space of topological maps
 IEEE Transactions on Robotics
, 2006
Abstract

Cited by 38 (1 self)
While probabilistic techniques have previously been investigated extensively for performing inference over the space of metric maps, no corresponding general-purpose methods exist for topological maps. We present the concept of probabilistic topological maps (PTMs), a sample-based representation that approximates the posterior distribution over topologies, given available sensor measurements. We show that the space of topologies is equivalent to the intractably large space of set partitions on the set of available measurements. The combinatorial nature of the problem is overcome by computing an approximate, sample-based representation of the posterior. The PTM is obtained by performing Bayesian inference over the space of all possible topologies, and provides a systematic solution to the problem of perceptual aliasing in the domain of topological mapping. In this paper, we describe a general framework for modeling measurements, and the use of a Markov chain Monte Carlo algorithm that uses specific instances of these models for odometry and appearance measurements to estimate the posterior distribution. We present experimental results that validate our technique and generate good maps when using odometry and appearance, derived from panoramic images, as sensor measurements. Index Terms—Bayesian inference, Markov chain Monte Carlo (MCMC), mobile robots, perceptual aliasing, probability distributions, sample-based representations, topological maps.
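Sampling the space of set partitions, as the PTM framework requires, can be sketched with a simple Metropolis move that relocates one measurement to another block (or a new singleton). The move and `log_post` are hypothetical; for brevity this sketch omits the Hastings correction a properly weighted asymmetric proposal would need.

```python
import numpy as np

rng = np.random.default_rng(3)

def partition_step(blocks, log_post):
    """One Metropolis move in the space of set partitions. `blocks` is a
    list of lists of measurement indices; `log_post` is an assumed
    unnormalized log-posterior over partitions (topologies)."""
    src = rng.integers(len(blocks))
    elem = blocks[src][rng.integers(len(blocks[src]))]
    cand = [b[:] for b in blocks]
    cand[src].remove(elem)
    cand = [b for b in cand if b]            # drop the block if emptied
    dst = rng.integers(len(cand) + 1)        # existing block or a new one
    if dst == len(cand):
        cand.append([elem])
    else:
        cand[dst].append(elem)
    if np.log(rng.random()) < log_post(cand) - log_post(blocks):
        return cand
    return blocks
```

Each accepted move merges, splits, or reshuffles landmark groupings, so the chain's samples approximate the posterior over topologies without enumerating the intractably many partitions.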
Sequential Monte Carlo methods for high-dimensional inverse problems: A case study for the Navier-Stokes equations
, 2013
Bayesian Structure From Motion
Abstract

Cited by 25 (1 self)
We formulate structure from motion as a Bayesian inference problem, and use a Markov chain Monte Carlo sampler to sample the posterior on this problem. This results in a method that can identify both small and large tracker errors, and yields reconstructions that are stable in the presence of these errors. Furthermore, the method gives detailed information on the range of ambiguities in structure given a particular dataset, and requires no special geometric formulation to cope with degenerate situations. Motion segmentation is obtained by a layer of discrete variables associating a point with an object. We demonstrate a sampler that successfully samples an approximation to the marginal on this domain, producing a relatively unambiguous segmentation.
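The discrete layer of the model, variables associating each point with an object for motion segmentation, can be sketched as Gibbs-style sampling of assignment labels. The per-point, per-object log-likelihood table is an assumption standing in for the paper's geometric model.

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_labels(log_lik, iters=50):
    """Sample discrete assignment variables: log_lik[i, k] is an assumed
    log-likelihood of point i belonging to object k (here held fixed,
    so each label is drawn from its conditional independently).
    Returns an (iters, n_points) array of label samples, approximating
    the marginal over segmentations."""
    n, k = log_lik.shape
    labels = rng.integers(k, size=n)
    samples = []
    for _ in range(iters):
        for i in range(n):
            p = np.exp(log_lik[i] - log_lik[i].max())   # stable softmax
            p /= p.sum()
            labels[i] = rng.choice(k, p=p)
        samples.append(labels.copy())
    return np.array(samples)
```

Reading off label frequencies across samples gives the "relatively unambiguous segmentation" the abstract refers to, along with its uncertainty.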
Finding People by Sampling
 Proc. Int'l Conf. on Computer Vision
, 1999
Abstract

Cited by 24 (1 self)
We show how to use a sampling method to find sparsely clad people in static images. People are modeled as an assembly of nine cylindrical segments. Segments are found using an EM algorithm, and then assembled into hypotheses incrementally, using a learned likelihood model. Each assembly step passes on a set of samples of its likelihood to the next; this yields effective pruning of the space of hypotheses. The collection of available nine-segment hypotheses is then represented by a set of equivalence classes, which yield an efficient pruning process. The posterior for the number of people is obtained from the class representatives. People are counted quite accurately in images of real scenes using an MAP estimate. We show the method allows top-down as well as bottom-up reasoning. While the method can be overwhelmed by very large numbers of segments, we show that this problem can be avoided by quite simple pruning steps.
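The incremental assembly with sample-based pruning can be sketched as follows: each step extends the surviving partial hypotheses with candidate segments, then resamples a fixed number of them in proportion to a likelihood. The `log_lik` function and the `keep` budget are illustrative assumptions, not the paper's learned model.

```python
import numpy as np

rng = np.random.default_rng(5)

def assemble(partial_hyps, candidates, log_lik, keep=100):
    """One incremental assembly step: extend each partial hypothesis with
    each candidate segment, then prune by resampling `keep` hypotheses
    (with replacement) in proportion to an assumed likelihood."""
    extended = [h + [c] for h in partial_hyps for c in candidates]
    w = np.exp([log_lik(h) for h in extended])
    w /= w.sum()
    idx = rng.choice(len(extended), size=min(keep, len(extended)), p=w)
    return [extended[i] for i in idx]
```

Repeating this step up to nine segments keeps the hypothesis set bounded instead of growing combinatorially, which is the pruning effect the abstract describes.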