Results 1–10 of 120
On Sequential Monte Carlo Sampling Methods for Bayesian Filtering
 STATISTICS AND COMPUTING
, 2000
Abstract

Cited by 660 (63 self)
In this article, we present an overview of methods for sequential simulation from posterior distributions. These methods are of particular interest in Bayesian filtering for discrete time dynamic models that are typically nonlinear and non-Gaussian. A general importance sampling framework is developed that unifies many of the methods which have been proposed over the last few decades in several different scientific disciplines. Novel extensions to the existing methods are also proposed. We show in particular how to incorporate local linearisation methods similar to those which have previously been employed in the deterministic filtering literature; these lead to very effective importance distributions. Furthermore we describe a method which uses Rao-Blackwellisation in order to take advantage of the analytic structure present in some important classes of state-space models. In a final section we develop algorithms for prediction, smoothing and evaluation of the likelihood in dynamic models.
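The sequential importance sampling framework surveyed in this abstract can be sketched as a basic bootstrap (SIR) particle filter. The scalar state-space model below is an illustrative assumption, not one taken from the paper: the transition prior serves as the importance distribution and the likelihood supplies the weights.

```python
import math
import random

def bootstrap_filter(y, n_particles=500, sigma_v=1.0, sigma_w=1.0):
    """Bootstrap (SIR) particle filter sketch for an illustrative scalar model:
        x_t = 0.5 * x_{t-1} + v_t,  v_t ~ N(0, sigma_v^2)
        y_t = x_t + w_t,            w_t ~ N(0, sigma_w^2)
    Proposal = transition prior; importance weight = observation likelihood."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for obs in y:
        # Propagate each particle through the transition prior.
        particles = [0.5 * x + random.gauss(0.0, sigma_v) for x in particles]
        # Importance weights: Gaussian likelihood of the observation.
        weights = [math.exp(-0.5 * ((obs - x) / sigma_w) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Weighted posterior-mean estimate before resampling.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to combat weight degeneracy.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates
```

The local-linearisation and Rao-Blackwellisation extensions described in the abstract replace the transition-prior proposal with better-informed importance distributions; this sketch shows only the baseline scheme they improve upon.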
Robust Monte Carlo Localization for Mobile Robots
, 2001
Abstract

Cited by 608 (83 self)
Mobile robot localization is the problem of determining a robot's pose from sensor data. This article presents a family of probabilistic localization algorithms known as Monte Carlo Localization (MCL). MCL algorithms represent a robot's belief by a set of weighted hypotheses (samples), which approximate the posterior under a common Bayesian formulation of the localization problem. Building on the basic MCL algorithm, this article develops a more robust algorithm called Mixture-MCL, which integrates two complementary ways of generating samples in the estimation. To apply this algorithm to mobile robots equipped with range finders, a kernel density tree is learned that permits fast sampling. Systematic empirical results illustrate the robustness and computational efficiency of the approach.
Filtering Via Simulation: Auxiliary Particle Filters
, 1997
Abstract

Cited by 519 (15 self)
This paper analyses the recently suggested particle approach to filtering time series. We suggest that the algorithm is not robust to outliers for two reasons: the design of the simulators and the use of the discrete support to represent the sequentially updating prior distribution. Both problems are tackled in this paper. We believe we have largely solved the first problem and have reduced the order of magnitude of the second. In addition we introduce the idea of stratification into the particle filter which allows us to perform online Bayesian calculations about the parameters which index the models and maximum likelihood estimation. The new methods are illustrated by using a stochastic volatility model and a time series model of angles. Some key words: Filtering, Markov chain Monte Carlo, Particle filter, Simulation, SIR, State space.
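The auxiliary-variable idea behind this paper can be sketched as follows: particles are first resampled according to how well their *predicted* locations explain the new observation, then propagated, with second-stage weights correcting the approximation. The AR(1)-plus-noise model and all parameter values below are illustrative assumptions, not taken from the paper.

```python
import math
import random

def auxiliary_particle_filter(y, n=500, phi=0.9, sigma_v=0.5, sigma_w=0.5):
    """Auxiliary particle filter sketch for an illustrative model:
        x_t = phi * x_{t-1} + v_t,   y_t = x_t + w_t.
    First-stage weights use the predictive likelihood evaluated at the
    transition mean (the auxiliary variable); second-stage weights correct."""
    def loglik(obs, x, s):
        return -0.5 * ((obs - x) / s) ** 2

    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    est = []
    for obs in y:
        mu = [phi * x for x in xs]  # transition means of each particle
        # First stage: resample indices by predictive fit at the means.
        first = [math.exp(loglik(obs, m, sigma_w)) for m in mu]
        t1 = sum(first)
        first = [w / t1 for w in first]
        idx = random.choices(range(n), weights=first, k=n)
        # Propagate the chosen particles through the transition.
        new = [phi * xs[i] + random.gauss(0.0, sigma_v) for i in idx]
        # Second stage: correct with likelihood-at-sample / likelihood-at-mean.
        second = [math.exp(loglik(obs, x, sigma_w) - loglik(obs, mu[i], sigma_w))
                  for x, i in zip(new, idx)]
        t2 = sum(second)
        second = [w / t2 for w in second]
        est.append(sum(w * x for w, x in zip(second, new)))
        xs = random.choices(new, weights=second, k=n)
    return est
```

Because outlying observations are anticipated at the first stage, fewer particles are wasted in low-likelihood regions than in the plain SIR filter, which is the robustness gain the abstract describes.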
Rates of convergence of the Hastings and Metropolis algorithms
 ANNALS OF STATISTICS
, 1996
Abstract

Cited by 163 (13 self)
We apply recent results in Markov chain theory to Hastings and Metropolis algorithms with either independent or symmetric candidate distributions, and provide necessary and sufficient conditions for the algorithms to converge at a geometric rate to a prescribed distribution π. In the independence case (in ℝ^k) these indicate that geometric convergence essentially occurs if and only if the candidate density is bounded below by a multiple of π; in the symmetric case (in ℝ only) we show geometric convergence essentially occurs if and only if π has geometric tails. We also evaluate recently developed computable bounds on the rates of convergence in this context: examples show that these theoretical bounds can be inherently extremely conservative, although when the chain is stochastically monotone the bounds may well be effective.
An Improved Particle Filter for Nonlinear Problems
, 2004
Abstract

Cited by 156 (8 self)
The Kalman filter provides an effective solution to the linear-Gaussian filtering problem. However, where there is nonlinearity, either in the model specification or the observation process, other methods are required. We consider methods known generically as particle filters, which include the condensation algorithm and the Bayesian bootstrap or sampling importance resampling (SIR) filter. These filters ...
The Unscented Particle Filter
, 2000
Abstract

Cited by 144 (9 self)
In this paper, we propose a new particle filter based on sequential importance sampling. The algorithm uses a bank of unscented filters to obtain the importance proposal distribution. This proposal has two very "nice" properties. Firstly, it makes efficient use of the latest available information and, secondly, it can have heavy tails. As a result, we find that the algorithm outperforms standard particle filtering and other nonlinear filtering methods very substantially. This experimental finding is in agreement with the theoretical convergence proof for the algorithm. The algorithm also includes resampling and (possibly) Markov chain Monte Carlo (MCMC) steps.
Geometric Convergence and Central Limit Theorems for Multidimensional Hastings and Metropolis Algorithms
 Biometrika
, 1996
Abstract

Cited by 123 (35 self)
We develop results on geometric ergodicity of Markov chains and apply these and other recent results in Markov chain theory to multidimensional Hastings and Metropolis algorithms. For those based on random walk candidate distributions, we find sufficient conditions for moments and for moment generating functions to converge at a geometric rate to a prescribed distribution π. By phrasing the conditions in terms of the curvature of the densities we show that the results apply to all distributions with positive density of the form π(x) = h(x) exp(p(x)) where h and p are polynomials on ℝ^d and p has an appropriate "negative-definiteness" property. From these results we further develop central limit theorems for the Metropolis algorithm. Converse results, showing non-geometric convergence rates for chains where the rejection rate is not bounded away from unity, are also given; these show that the negative-definiteness property is not redundant. Work supported in part by NSF Grant DMS920568...
Using the CONDENSATION Algorithm for Robust, Vision-based Mobile Robot Localization
, 1999
Abstract

Cited by 115 (29 self)
To navigate reliably in indoor environments, a mobile robot must know where it is. This includes both the ability to globally localize the robot from scratch and to track the robot's position once its location is known. Vision has long been advertised as providing a solution to these problems, but we still lack efficient solutions in unmodified environments. Many existing approaches require modification of the environment to function properly, and those that work within unmodified environments seldom address the problem of global localization. In this paper we present a novel, vision-based localization method based on the CONDENSATION algorithm [17, 18], a Bayesian filtering method that uses a sampling-based density representation. We show how the CONDENSATION algorithm can be used in a novel way to track the position of the camera platform rather than tracking an object in the scene. In addition, it can also be used to globally localize the camera platform, given a visua...
Adapting the Sample Size in Particle Filters Through KLD-Sampling
 International Journal of Robotics Research
, 2003
Abstract

Cited by 97 (8 self)
Over the last few years, particle filters have been applied with great success to a variety of state estimation problems. In this paper we present a statistical approach to increasing the efficiency of particle filters by adapting the size of sample sets during the estimation process.
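The key quantity in KLD-sampling is the number of particles needed so that, with probability 1−δ, the KL divergence between the sample-based approximation and the true posterior stays below a bound ε, given the number k of histogram bins the samples currently occupy. A minimal sketch of that sample-size formula, using the Wilson-Hilferty approximation to the chi-square quantile (the function name and defaults are mine, not the paper's):

```python
import math
from statistics import NormalDist

def kld_sample_size(k, epsilon=0.05, delta=0.01):
    """Approximate number of particles so that, with probability 1 - delta,
    the KL divergence between the particle approximation and the true
    posterior is below epsilon, given k occupied histogram bins."""
    if k <= 1:
        return 1
    z = NormalDist().inv_cdf(1.0 - delta)  # upper 1-delta quantile of N(0, 1)
    a = 2.0 / (9.0 * (k - 1))
    # Wilson-Hilferty approximation to the chi-square_{k-1} quantile.
    return math.ceil((k - 1) / (2.0 * epsilon) * (1.0 - a + math.sqrt(a) * z) ** 3)
```

During filtering, k grows as the belief spreads over the state space and shrinks as it concentrates, so the required sample size rises during global localization and falls once the robot's pose is well determined.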
Computable bounds for geometric convergence rates of Markov chains
, 1994
Abstract

Cited by 51 (7 self)
Recent results for geometrically ergodic Markov chains show that there exist constants R < ∞, ρ < 1 such that

sup_{|f| ≤ V} | ∫ Pⁿ(x, dy) f(y) − ∫ π(dy) f(y) | ≤ R V(x) ρⁿ

where π is the invariant probability measure and V is any solution of the drift inequalities

∫ P(x, dy) V(y) ≤ λ V(x) + b·1_C(x)

which are known to guarantee geometric convergence for λ < 1, b < ∞ and a suitable small set C. In this paper we identify for the first time computable bounds on R and ρ in terms of λ, b and the minorizing constants which guarantee the smallness of C. In the simplest case, where C is an atom α with P(α, α) ≥ δ, we can choose any ρ > ϑ where

[1 − ϑ]⁻¹ = (1/(1 − λ)²) [1 − λ + b + b² + ζ_α (b(1 − λ) + b²)]

and

ζ_α ≤ ((34 − 8δ²)/δ³) (b/(1 − λ))²,

and we can then choose R ≤ ρ/[ρ − ϑ]. The bounds for general small sets C are similar but more complex. We apply these to simple queueing models and Markov chain Mo...