Results 1–10 of 83
A Survey of Convergence Results on Particle Filtering Methods for Practitioners
, 2002
Abstract

Cited by 239 (9 self)
Optimal filtering problems are ubiquitous in signal processing and related fields. Except for a restricted class of models, the optimal filter does not admit a closed-form expression. Particle filtering methods are a set of flexible and powerful sequential Monte Carlo methods designed to solve the optimal filtering problem numerically. The posterior distribution of the state is approximated by a large set of Dirac delta masses (samples/particles) that evolve randomly in time according to the dynamics of the model and the observations. The particles are interacting; thus, classical limit theorems relying on statistically independent samples do not apply. In this paper, our aim is to present a survey of recent convergence results on this class of methods to make them accessible to practitioners.
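The abstract above describes the generic mechanism: particles are propagated through the model dynamics, weighted by the observation likelihood, and resampled. As a concrete illustration, here is a minimal bootstrap particle filter for a hypothetical scalar linear-Gaussian model; the model, parameters, and function names are our own, not taken from the survey:

```python
import numpy as np

def bootstrap_filter(y, n_particles=1000, phi=0.9, seed=0):
    """Bootstrap particle filter for the illustrative model
    x_t = phi * x_{t-1} + v_t,  y_t = x_t + w_t,  v_t, w_t ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    # Initialise particles from the stationary law of the state process.
    x = rng.normal(0.0, 1.0 / np.sqrt(1.0 - phi**2), n_particles)
    means = []
    for yt in y:
        x = phi * x + rng.normal(0.0, 1.0, n_particles)   # propagate
        logw = -0.5 * (yt - x) ** 2                       # N(yt; x, 1) weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                       # filtered mean
        # Resampling makes the particles interact, which is why the
        # classical i.i.d. limit theorems do not apply directly.
        x = rng.choice(x, size=n_particles, p=w)
    return np.array(means)

# Simulate a path from the same model and run the filter on it.
rng = np.random.default_rng(1)
states = [rng.normal(0.0, 1.0 / np.sqrt(1.0 - 0.9**2))]
for _ in range(50):
    states.append(0.9 * states[-1] + rng.normal())
y = np.array(states[1:]) + rng.normal(size=50)
est = bootstrap_filter(y)
```

The log-weight shift by `logw.max()` before exponentiation is a standard numerical guard against underflow, not something prescribed by the survey.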
Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference
Ann. Statist.
Abstract

Cited by 146 (4 self)
“Particle filters” refers to a general class of iterative algorithms that perform Monte Carlo approximations of a given sequence of distributions of interest (πt). We establish in this paper a central limit theorem for the Monte Carlo estimates produced by these computational methods. This result holds under minimal assumptions on the distributions πt, and applies in a general framework which encompasses most of the sequential Monte Carlo methods that have been considered in the literature, including the resample-move algorithm of Gilks and Berzuini [J. R. Stat. Soc. Ser. B Stat. Methodol. 63 (2001) 127–146] and the residual resampling scheme. The corresponding asymptotic variances provide a convenient measurement of the precision of a given particle filter. We study, in particular, in some typical examples of Bayesian applications, whether and at which rate these asymptotic variances diverge in time, in order to assess the long-term reliability of the considered algorithm.
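The residual resampling scheme mentioned in the abstract can be sketched as follows (our own illustration): each particle is first copied a deterministic number of times, and only the leftover slots are filled randomly, which lowers the resampling variance relative to plain multinomial resampling.

```python
import numpy as np

def residual_resample(weights, rng=None):
    """Residual resampling: copy particle i floor(N * w_i) times
    deterministically, then fill the remaining slots by multinomial
    sampling from the leftover (residual) weights."""
    rng = rng or np.random.default_rng(0)
    w = np.asarray(weights, dtype=float)
    n = len(w)
    w = w / w.sum()
    counts = np.floor(n * w).astype(int)        # deterministic copies
    n_residual = n - counts.sum()
    if n_residual > 0:
        residual = n * w - counts               # leftover probability mass
        residual /= residual.sum()
        counts += rng.multinomial(n_residual, residual)
    return np.repeat(np.arange(n), counts)      # resampled particle indices

idx = residual_resample([0.5, 0.3, 0.1, 0.1])
# len(idx) == 4; particle 0 gets at least floor(4 * 0.5) = 2 copies.
```

Only the residual draw is random here, which is the source of the reduced asymptotic variance that a CLT of this kind can quantify.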
Recursive Monte Carlo filters: Algorithms and theoretical analysis
, 2003
Abstract

Cited by 83 (0 self)
powerful tool to perform computations in general state space models. We discuss and compare the accept–reject version with the more common sampling importance resampling version of the algorithm. In particular, we show how auxiliary variable methods and stratification can be used in the accept–reject version, and we compare different resampling techniques. In a second part, we show laws of large numbers and a central limit theorem for these Monte Carlo filters by simple induction arguments that need only weak conditions. We also show that, under stronger conditions, the required sample size is independent of the length of the observed series. 1. State space and hidden Markov models. A general state space or hidden Markov model consists of an unobserved state sequence (Xt) and an observation sequence (Yt) with the following properties. State evolution: X0, X1, X2, ... is a Markov chain with X0 ∼ a0(x) dµ(x) and Xt | Xt−1 = xt−1 ∼ at(xt−1, x) dµ(x). Generation of observations: conditionally on (Xt), the Yt's are independent and Yt depends on Xt only, with Yt | Xt = xt ∼ bt(xt, y) dν(y). These models occur in a variety of applications; linear state space models are equivalent to ARMA models (see, e.g., [16]).
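The transition and observation densities at and bt defined above can be made concrete with a small simulation sketch. The Gaussian choices for at and bt below are ours, purely for illustration:

```python
import numpy as np

def simulate_hmm(T, seed=0):
    """Simulate a general state space model with illustrative Gaussian
    densities: at is an AR(1) transition, bt adds observation noise."""
    rng = np.random.default_rng(seed)
    x = rng.normal()                    # X0 ~ a0 = N(0, 1)
    xs, ys = [], []
    for _ in range(T):
        x = 0.8 * x + rng.normal()      # Xt | Xt-1 = x  ~ at(x, .)
        y = x + 0.5 * rng.normal()      # Yt | Xt = x    ~ bt(x, .)
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

xs, ys = simulate_hmm(100)
# The observations track the hidden states up to the bt noise.
```

Given such a pair (Xt, Yt), the filtering problem treated throughout this listing is to compute the conditional law of Xt given Y1, ..., Yt.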
A Robustification Approach to Stability and to Uniform Particle Approximation of Nonlinear Filters: The Example of Pseudo-Mixing Signals
, 2002
Abstract

Cited by 37 (3 self)
We propose a new approach to studying the stability of the optimal filter with respect to its initial condition, by introducing a "robust" filter which is exponentially stable and which approximates the optimal filter uniformly in time. The "robust" filter is obtained here by truncation of the likelihood function, and the robustification result is proved under the assumptions that the Markov transition kernel satisfies a pseudo-mixing condition (weaker than the usual mixing condition) and that the observations are "sufficiently good". This robustification approach also allows us to prove the uniform convergence of several particle approximations to the optimal filter, in some cases of non-ergodic signals.
A survey of sequential Monte Carlo methods for economics and finance
, 2009
Abstract

Cited by 34 (7 self)
This paper serves as an introduction and survey for economists to the field of sequential Monte Carlo methods, which are also known as particle filters. Sequential Monte Carlo methods are simulation-based algorithms used to compute the high-dimensional and/or complex integrals that arise regularly in applied work. These methods are becoming increasingly popular in economics and finance, from dynamic stochastic general equilibrium models in macroeconomics to option pricing. The objective of this paper is to explain the basics of the methodology, provide references to the literature, and cover some of the theoretical results that justify the methods in practice.
Asymptotic stability of the Wonham filter: ergodic and nonergodic signals
SIAM J. Control Optim.
Abstract

Cited by 33 (15 self)
Abstract. The stability problem of the Wonham filter with respect to its initial condition is addressed. The case of ergodic signals is revisited in view of a gap in the classic work of H. Kunita (1971). We give new bounds for the exponential stability rates, which do not depend on the observations. In the non-ergodic case, stability is implied by identifiability conditions, formulated explicitly in terms of the transition intensities matrix and the observation structure. Key words. Nonlinear filtering, stability, Wonham filter
A basic convergence result for particle filtering
 IEEE TRANSACTIONS ON SIGNAL PROCESSING
, 2007
Abstract

Cited by 29 (8 self)
The basic nonlinear filtering problem for dynamical systems is considered. Approximating the optimal filter estimate by particle filter methods has become perhaps the most common and useful method in recent years. Many variants of particle filters have been suggested, and there is an extensive literature on the theoretical aspects of the quality of the approximation. Still, a clear-cut result that the approximate solution, for unbounded functions, converges to the true optimal estimate as the number of particles tends to infinity seems to be lacking. It is the purpose of this contribution to give such a basic convergence result.
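One way to see such a convergence claim at work: for a linear-Gaussian model the optimal filter is the Kalman filter, so the particle approximation error can be measured exactly and should shrink as the number of particles grows. The sketch and parameter choices below are ours, not taken from the paper:

```python
import numpy as np

PHI, Q, R = 0.9, 1.0, 1.0   # illustrative linear-Gaussian model

def kalman_means(y):
    """Exact filtered means for x_t = PHI x_{t-1} + N(0,Q), y_t = x_t + N(0,R)."""
    m, p = 0.0, Q / (1.0 - PHI**2)              # stationary prior on x_0
    out = []
    for yt in y:
        m, p = PHI * m, PHI**2 * p + Q          # predict
        k = p / (p + R)                         # Kalman gain
        m, p = m + k * (yt - m), (1.0 - k) * p  # update
        out.append(m)
    return np.array(out)

def particle_means(y, n, seed=0):
    """Bootstrap particle filter estimate of the same filtered means."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, np.sqrt(Q / (1.0 - PHI**2)), n)
    out = []
    for yt in y:
        x = PHI * x + rng.normal(0.0, np.sqrt(Q), n)
        logw = -0.5 * (yt - x) ** 2 / R
        w = np.exp(logw - logw.max())
        w /= w.sum()
        out.append(np.sum(w * x))
        x = rng.choice(x, size=n, p=w)
    return np.array(out)

# Simulate data, then compare particle estimates to the exact filter.
rng = np.random.default_rng(7)
x, obs = rng.normal(0.0, np.sqrt(Q / (1.0 - PHI**2))), []
for _ in range(30):
    x = PHI * x + rng.normal(0.0, np.sqrt(Q))
    obs.append(x + rng.normal(0.0, np.sqrt(R)))
obs = np.array(obs)
km = kalman_means(obs)
err_small = np.abs(particle_means(obs, 100) - km).mean()
err_large = np.abs(particle_means(obs, 10000) - km).mean()
```

Increasing the particle count from 100 to 10,000 should visibly reduce the mean absolute deviation from the exact Kalman estimate, in line with the Monte Carlo rate.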
Analyticity of entropy rate of a hidden Markov chain
In Proc. of IEEE International Symposium on Information Theory, Adelaide, Australia, September 4–9, 2005
Abstract

Cited by 29 (12 self)
We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and examples are given for the relaxed conditions. We study a special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and we give necessary and sufficient conditions for analyticity of the entropy rate for this case. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
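As a numerical companion to these analytic results, the entropy rate of a hidden Markov chain can be estimated by simulation: by the Shannon–McMillan–Breiman theorem it is the limit of −(1/n) log P(y1, ..., yn), with the likelihood factorised by the forward algorithm. The parameters below are illustrative, not taken from the paper:

```python
import numpy as np

P = np.array([[0.7, 0.3], [0.4, 0.6]])   # hidden-chain transition matrix
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission probabilities

def entropy_rate_estimate(n=20000, seed=0):
    """Estimate the entropy rate (in nats) of the binary hidden Markov
    chain (P, B) as -(1/n) log P(y_1..y_n) along a simulated path,
    with the likelihood computed by the forward algorithm."""
    rng = np.random.default_rng(seed)
    pi = np.array([4.0 / 7.0, 3.0 / 7.0])   # stationary distribution of P
    x = rng.choice(2, p=pi)                  # X_0 ~ pi
    alpha = pi                               # predictive law of X_1
    loglik = 0.0
    for _ in range(n):
        x = rng.choice(2, p=P[x])            # hidden step
        y = rng.choice(2, p=B[x])            # emitted symbol
        py = alpha @ B[:, y]                 # p(y_t | y_1..y_{t-1})
        loglik += np.log(py)
        alpha = alpha * B[:, y] / py         # filter: p(x_t | y_1..y_t)
        alpha = alpha @ P                    # predict x_{t+1}
    return -loglik / n

h = entropy_rate_estimate()
# For these parameters h lies between the conditional entropy
# H(Y_t | X_t) ~ 0.40 nats and the marginal entropy ~ 0.67 nats.
```

Varying the entries of P or B and re-running this estimator gives a crude empirical picture of the smooth parameter dependence that the paper establishes rigorously.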
Uniform time average consistency of Monte Carlo particle filters
, 2009
Abstract

Cited by 25 (0 self)
Abstract. We prove that bootstrap-type Monte Carlo particle filters approximate the optimal nonlinear filter in a time-average sense, uniformly with respect to the time horizon, when the signal is ergodic and the particle system satisfies a tightness property. The latter is satisfied without further assumptions when the signal state space is compact, as well as in the noncompact setting when the signal is geometrically ergodic and the observations satisfy additional regularity assumptions.
"Shape Activity": A Continuous State HMM for Moving/Deforming Shapes with Application to Abnormal Activity Detection
Abstract

Cited by 24 (11 self)
The aim is to model "activity" performed by a group of moving and interacting objects (which can be people or cars or different rigid components of the human body) and use the models for abnormal activity detection. Previous approaches to modeling group activity include co-occurrence statistics (individual and joint histograms) and Dynamic Bayesian Networks, neither of which is applicable when the number of interacting objects is large. We treat the objects as point objects (referred to as "landmarks") and propose to model their changing configuration as a moving and deforming "shape" (using Kendall's shape theory for discrete landmarks). A continuous state Hidden Markov Model (HMM) is defined for landmark shape dynamics in an activity. The configuration of landmarks at a given time forms the observation vector, and the corresponding shape and the scaled Euclidean motion parameters form the hidden state vector. An abnormal activity is then defined as a change in the shape activity model, which could be slow or drastic and whose parameters are unknown. Results are shown on a real abnormal activity detection problem involving multiple moving objects.