Results 1–10 of 54
A Survey of Convergence Results on Particle Filtering Methods for Practitioners
, 2002
Abstract

Cited by 133 (4 self)
Optimal filtering problems are ubiquitous in signal processing and related fields. Except for a restricted class of models, the optimal filter does not admit a closed-form expression. Particle filtering methods are a set of flexible and powerful sequential Monte Carlo methods designed to solve the optimal filtering problem numerically. The posterior distribution of the state is approximated by a large set of Dirac delta masses (samples/particles) that evolve randomly in time according to the dynamics of the model and the observations. The particles are interacting; thus, classical limit theorems relying on statistically independent samples do not apply. In this paper, our aim is to present a survey of recent convergence results on this class of methods to make them accessible to practitioners.
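The propagate/weight/resample cycle summarized in this abstract can be sketched in a few lines. The scalar model, noise levels, and particle count below are illustrative assumptions, not taken from the survey:

```python
import math
import random

def bootstrap_particle_filter(ys, n_particles=500, seed=0):
    """Minimal bootstrap particle filter for a hypothetical 1-D model:
    x_t = 0.5 * x_{t-1} + v_t,  v_t ~ N(0, 1)   (state dynamics)
    y_t = x_t + w_t,            w_t ~ N(0, 1)   (observation)
    Chosen only to illustrate the propagate/weight/resample cycle."""
    rng = random.Random(seed)
    # Initialise particles from the prior N(0, 1).
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in ys:
        # Propagate each particle through the state dynamics.
        particles = [0.5 * x + rng.gauss(0.0, 1.0) for x in particles]
        # Weight by the observation likelihood N(y; x, 1).
        weights = [math.exp(-0.5 * (y - x) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Weighted posterior-mean estimate before resampling.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling: the particles interact here, which is
        # why i.i.d. limit theorems do not apply directly.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

est = bootstrap_particle_filter([0.2, 0.5, -0.1, 0.3])
print(len(est))  # one posterior-mean estimate per observation
```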
Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference
 Ann. Statist
Abstract

Cited by 58 (2 self)
“Particle filters” refers to a general class of iterative algorithms that performs Monte Carlo approximations of a given sequence of distributions of interest (πt). We establish in this paper a central limit theorem for the Monte Carlo estimates produced by these computational methods. This result holds under minimal assumptions on the distributions πt, and applies in a general framework which encompasses most of the sequential Monte Carlo methods that have been considered in the literature, including the resample-move algorithm of Gilks and Berzuini [J. R. Stat. Soc. Ser. B Stat. Methodol. 63 (2001) 127–146] and the residual resampling scheme. The corresponding asymptotic variances provide a convenient measurement of the precision of a given particle filter. We study, in particular, in some typical examples of Bayesian applications, whether and at which rate these asymptotic variances diverge in time, in order to assess the long-term reliability of the considered algorithm.
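Residual resampling, one of the schemes covered by this central limit theorem, can be sketched as follows; the function name and the example weights are our own:

```python
import math
import random

def residual_resample(weights, rng):
    """Residual resampling: keep floor(N * w_i) deterministic copies of
    particle i, then draw the remaining particles multinomially from the
    residual weights. A sketch of the scheme named in the abstract."""
    n = len(weights)
    # Deterministic part: guaranteed copies.
    counts = [math.floor(n * w) for w in weights]
    n_det = sum(counts)
    # Random part: multinomial draw from the residual weights.
    residuals = [n * w - c for w, c in zip(weights, counts)]
    if n_det < n:
        extra = rng.choices(range(n), weights=residuals, k=n - n_det)
        for i in extra:
            counts[i] += 1
    # Expand the counts into a list of selected particle indices.
    return [i for i, c in enumerate(counts) for _ in range(c)]

rng = random.Random(1)
idx = residual_resample([0.5, 0.3, 0.2], rng)
print(sorted(idx))
```

The deterministic part lowers the variance of the resampled particle counts relative to plain multinomial resampling, which is why the scheme is singled out in asymptotic-variance comparisons.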
Recursive Monte Carlo filters: Algorithms and theoretical analysis
, 2003
Abstract

Cited by 40 (0 self)
powerful tool to perform computations in general state space models. We discuss and compare the accept–reject version with the more common sampling importance resampling version of the algorithm. In particular, we show how auxiliary variable methods and stratification can be used in the accept–reject version, and we compare different resampling techniques. In a second part, we show laws of large numbers and a central limit theorem for these Monte Carlo filters by simple induction arguments that need only weak conditions. We also show that, under stronger conditions, the required sample size is independent of the length of the observed series. 1. State space and hidden Markov models. A general state space or hidden Markov model consists of an unobserved state sequence (Xt) and an observation sequence (Yt) with the following properties. State evolution: X0, X1, X2, ... is a Markov chain with X0 ∼ a0(x) dµ(x) and Xt | Xt−1 = xt−1 ∼ at(xt−1, x) dµ(x). Generation of observations: conditionally on (Xt), the Yt's are independent, and Yt depends on Xt only, with Yt | Xt = xt ∼ bt(xt, y) dν(y). These models occur in a variety of applications. Linear state space models are equivalent to ARMA models (see, e.g., [16]) and have become popular.
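The two model equations (state evolution via the densities at, observation generation via bt) can be illustrated by simulating a concrete instance. The Gaussian densities and coefficients below are assumptions for the sketch, not taken from the paper:

```python
import random

def simulate_state_space(T, seed=0):
    """Simulate an illustrative instance of the generic state space model:
    a_t corresponds to X_t = 0.9 * X_{t-1} + N(0, 1) noise,
    b_t corresponds to Y_t = X_t + N(0, 0.5^2) noise.
    Only (Yt) would be observed by a filter; (Xt) is hidden."""
    rng = random.Random(seed)
    x = rng.gauss(0.0, 1.0)                    # X_0 ~ a_0
    xs, ys = [], []
    for _ in range(T):
        x = 0.9 * x + rng.gauss(0.0, 1.0)      # X_t | X_{t-1} = x ~ a_t
        y = x + rng.gauss(0.0, 0.5)            # Y_t | X_t = x ~ b_t
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = simulate_state_space(5)
print(len(xs), len(ys))
```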
A Robustification Approach to Stability and to Uniform Particle Approximation of Nonlinear Filters: The Example of Pseudo-Mixing Signals
, 2002
Abstract

Cited by 30 (3 self)
We propose a new approach to study the stability of the optimal filter with respect to its initial condition, by introducing a "robust" filter, which is exponentially stable and which approximates the optimal filter uniformly in time. The "robust" filter is obtained here by truncation of the likelihood function, and the robustification result is proved under the assumption that the Markov transition kernel satisfies a pseudo-mixing condition (weaker than the usual mixing condition) and that the observations are "sufficiently good". This robustification approach also allows us to prove the uniform convergence of several particle approximations to the optimal filter in some cases of nonergodic signals.
Asymptotic stability of the Wonham filter: ergodic and nonergodic signals
 SIAM J. Control Optim
Abstract

Cited by 25 (13 self)
The stability problem of the Wonham filter with respect to initial conditions is addressed. The case of ergodic signals is revisited in view of a gap in the classic work of H. Kunita (1971). We give new bounds for the exponential stability rates, which do not depend on the observations. In the nonergodic case, stability is implied by identifiability conditions, formulated explicitly in terms of the transition intensities matrix and the observation structure.
"Shape Activity": A Continuous State HMM for Moving/Deforming Shapes with Application to Abnormal Activity Detection
Abstract

Cited by 19 (10 self)
The aim is to model "activity" performed by a group of moving and interacting objects (which can be people or cars or different rigid components of the human body) and use the models for abnormal activity detection. Previous approaches to modeling group activity include co-occurrence statistics (individual and joint histograms) and Dynamic Bayesian Networks, neither of which is applicable when the number of interacting objects is large. We treat the objects as point objects (referred to as "landmarks") and propose to model their changing configuration as a moving and deforming "shape" (using Kendall's shape theory for discrete landmarks). A continuous state Hidden Markov Model (HMM) is defined for landmark shape dynamics in an activity. The configuration of landmarks at a given time forms the observation vector and the corresponding shape and the scaled Euclidean motion parameters form the hidden state vector. An abnormal activity is then defined as a change in the shape activity model, which could be slow or drastic and whose parameters are unknown. Results are shown on a real abnormal activity detection problem involving multiple moving objects.
Analyticity of entropy rate of a hidden Markov chain
 In Proc. of IEEE International Symposium on Information Theory, Adelaide, Australia, September 4–9, 2005
Abstract

Cited by 19 (8 self)
We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and examples are given for the relaxed conditions. We study a special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and we give necessary and sufficient conditions for analyticity of the entropy rate for this case. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
A basic convergence result for particle filtering
 IEEE Transactions on Signal Processing
, 2007
Abstract

Cited by 18 (7 self)
The basic nonlinear filtering problem for dynamical systems is considered. Approximating the optimal filter estimate by particle filter methods has become perhaps the most common and useful method in recent years. Many variants of particle filters have been suggested, and there is an extensive literature on the theoretical aspects of the quality of the approximation. Still, a clear-cut result that the approximate solution, for unbounded functions, converges to the true optimal estimate as the number of particles tends to infinity seems to be lacking. It is the purpose of this contribution to give such a basic convergence result.
Change Detection in Partially Observed Nonlinear Dynamic Systems with Unknown Change Parameters
 in American Control Conference (ACC)
, 2004
Abstract

Cited by 16 (14 self)
We study the change detection problem in partially observed nonlinear dynamic systems. We assume that the change parameters are unknown and the change could be gradual (slow) or sudden (drastic). For most nonlinear systems, no finite dimensional filters exist and approximate filtering methods like the particle filter are used. Even when change parameters are unknown, drastic changes can be detected easily using the increase in tracking (output) error or the negative log of the observation likelihood (OL). But slow changes usually get missed. We propose in this paper a statistic for slow change detection, which turns out to be the same as the Kerridge inaccuracy between the posterior state distribution and the normal system prior. We show asymptotic convergence (under certain assumptions) of the bounding, modeling and particle filtering errors in its approximation using a particle filter optimal for the normal system. We also demonstrate, using the bounds on the errors, that our statistic works in situations where the observation likelihood (OL) fails and vice versa.
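The negative-log observation-likelihood (OL) statistic mentioned here for drastic-change detection can be approximated directly from a particle cloud. The Gaussian observation model and all parameters below are hypothetical, chosen only to show why an observation far from the cloud inflates the statistic:

```python
import math
import random

def neg_log_obs_likelihood(particles, y, obs_std=1.0):
    """Particle approximation of -log p(y | past observations): the OL is
    approximated by averaging the Gaussian observation density N(y; x_i,
    obs_std^2) over the particle cloud. The observation model is an
    assumption for this sketch."""
    norm = obs_std * math.sqrt(2.0 * math.pi)
    lik = sum(
        math.exp(-0.5 * ((y - x) / obs_std) ** 2) / norm
        for x in particles
    ) / len(particles)
    return -math.log(lik)

rng = random.Random(0)
cloud = [rng.gauss(0.0, 1.0) for _ in range(200)]
# A drastic change (observation far from the cloud) inflates the statistic,
# which is the detection mechanism the abstract contrasts with its
# slow-change statistic.
print(neg_log_obs_likelihood(cloud, 0.0) < neg_log_obs_likelihood(cloud, 8.0))
```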
Interacting and annealing particle filters: Mathematics and a recipe for applications
 J. of Mathematical Imaging and Vision
, 2007
Abstract

Cited by 15 (8 self)
Interacting and annealing are two powerful strategies that are applied in different areas of stochastic modelling and data analysis. Interacting particle systems approximate a distribution of interest by a finite number of particles where the particles interact between the time steps. In computer vision, they are commonly known as particle filters. Simulated annealing, on the other hand, is a global optimization method derived from statistical mechanics. A recent heuristic approach to fuse these two techniques for motion capturing has become known as annealed particle filter. In order to analyze these techniques, we rigorously derive in this paper two algorithms with annealing properties based on the mathematical theory of interacting particle systems. Convergence results and sufficient parameter restrictions enable us