Results 1–10 of 195
A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking
IEEE Transactions on Signal Processing, 2002
Cited by 1703 (2 self)
Abstract
Increasingly, for many application areas, it is becoming important to include elements of nonlinearity and non-Gaussianity in order to model accurately the underlying dynamics of a physical system. Moreover, it is typically crucial to process data online as it arrives, both from the point of view of storage costs as well as for rapid adaptation to changing signal characteristics. In this paper, we review both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters. Particle filters are sequential Monte Carlo methods based on point mass (or “particle”) representations of probability densities, which can be applied to any state-space model and which generalize the traditional Kalman filtering methods. Several variants of the particle filter such as SIR, ASIR, and RPF are introduced within a generic framework of the sequential importance sampling (SIS) algorithm. These are discussed and compared with the standard EKF through an illustrative example.
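The SIR variant surveyed in this abstract can be sketched in a few lines. The following is a minimal bootstrap (SIR) filter, assuming the scalar benchmark model used in the paper's illustrative example, x_t = x_{t-1}/2 + 25 x_{t-1}/(1 + x_{t-1}^2) + 8 cos(1.2 t) + v_t, y_t = x_t^2/20 + w_t; the noise variances q and r and the particle count are illustrative choices, not prescriptions from the paper.

```python
import numpy as np

def sir_filter(y, n_particles=500, q=10.0, r=1.0, rng=None):
    """Bootstrap (SIR) particle filter for the scalar benchmark model
    x_t = x_{t-1}/2 + 25 x_{t-1}/(1+x_{t-1}^2) + 8 cos(1.2 t) + v_t,
    y_t = x_t^2/20 + w_t,  with v_t ~ N(0, q), w_t ~ N(0, r)."""
    rng = np.random.default_rng(rng)
    x = rng.normal(0.0, np.sqrt(q), n_particles)      # initial particle cloud
    means = []
    for t, yt in enumerate(y, start=1):
        # propagate through the transition prior (the defining SIR proposal)
        x = x / 2 + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t) \
            + rng.normal(0.0, np.sqrt(q), n_particles)
        # weight by the observation likelihood, in log space for stability
        logw = -0.5 * (yt - x**2 / 20) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                   # posterior-mean estimate
        # multinomial resampling at every step
        x = rng.choice(x, size=n_particles, p=w)
    return np.array(means)
```

The ASIR and RPF variants differ mainly in how the proposal and the resampled cloud are constructed; the weighting/resampling skeleton above is shared.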
Dynamic Bayesian Networks: Representation, Inference and Learning
2002
Cited by 700 (3 self)
Abstract
Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have been used for problems ranging from tracking planes and missiles to predicting the economy. However, HMMs and KFMs are limited in their “expressive power”. Dynamic Bayesian Networks (DBNs) generalize HMMs by allowing the state space to be represented in factored form, instead of as a single discrete random variable. DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian. In this thesis, I will discuss how to represent many different kinds of models as DBNs, how to perform exact and approximate inference in DBNs, and how to learn DBN models from sequential data.
In particular, the main novel technical contributions of this thesis are as follows: a way of representing hierarchical HMMs as DBNs, which enables inference to be done in O(T) time instead of O(T^3), where T is the length of the sequence; an exact smoothing algorithm that takes O(log T) space instead of O(T); a simple way of using the junction tree algorithm for online inference in DBNs; new complexity bounds on exact online inference in DBNs; a new deterministic approximate inference algorithm called factored frontier; an analysis of the relationship between the BK algorithm and loopy belief propagation; a way of applying Rao-Blackwellised particle filtering to DBNs in general, and the SLAM (simultaneous localization and mapping) problem in particular; a way of extending the structural EM algorithm to DBNs; and a variety of different applications of DBNs. However, perhaps the main value of the thesis is its catholic presentation of the field of sequential data modelling.
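As a point of reference for what DBN inference generalizes, here is a minimal sketch of the forward (filtering) recursion for a discrete HMM, the simplest DBN with a single discrete state variable; the matrices in the usage note are illustrative, not taken from the thesis.

```python
import numpy as np

def hmm_forward(pi, A, B, obs):
    """Forward (filtering) recursion for a discrete HMM -- the simplest DBN.
    pi: initial state distribution, A[i, j] = P(s_t = j | s_{t-1} = i),
    B[i, k] = P(y_t = k | s_t = i), obs: sequence of observed symbol indices.
    Returns one row per time step: P(s_t | y_{1:t})."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()                 # normalize to get P(s_1 | y_1)
    filtered = [alpha]
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]    # predict one step, then condition on y_t
        alpha /= alpha.sum()
        filtered.append(alpha)
    return np.array(filtered)
```

A DBN replaces the single state index s_t by a set of variables with factored transition structure; the recursion above then becomes junction-tree (or approximate) inference over that factored slice.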
Convergence of Sequential Monte Carlo Methods
Sequential Monte Carlo Methods in Practice, 2000
Cited by 198 (14 self)
Abstract
Bayesian estimation problems where the posterior distribution evolves over time through the accumulation of data arise in many applications in statistics and related fields. Recently, a large number of algorithms and applications based on sequential Monte Carlo methods (also known as particle filtering methods) have appeared in the literature to solve this class of problems; see (Doucet, de Freitas & Gordon, 2001) for a survey. However, few of these methods have been proved to converge rigorously. The purpose of this paper is to address this issue. We present a general sequential Monte Carlo (SMC) method which includes most of the important features present in current SMC methods. This method generalizes and encompasses many recent algorithms. Under mild regularity conditions, we obtain rigorous convergence results for this general SMC method and therefore give theoretical backing for the validity of all the algorithms that can be obtained as particular cases of it.
Monte Carlo smoothing for nonlinear time series
Journal of the American Statistical Association, 2004
Cited by 130 (17 self)
Abstract
We develop methods for performing smoothing computations in general state-space models. The methods rely on a particle representation of the filtering distributions, and their evolution through time using sequential importance sampling and resampling ideas. In particular, novel techniques are presented for generation of sample realizations of historical state sequences. This is carried out in a forward-filtering backward-smoothing procedure which can be viewed as the nonlinear, non-Gaussian counterpart of standard Kalman-filter-based simulation smoothers in the linear Gaussian case. Convergence in the mean-squared error sense of the smoothed trajectories is proved, showing the validity of our proposed method. The methods are tested in a substantial application for the processing of speech signals represented by a time-varying autoregression and parameterised in terms of time-varying partial correlation coefficients, comparing the results of our algorithm with those from a simple smoother based upon the filtered trajectories.
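The backward-simulation pass described in this abstract can be sketched as follows. It assumes the filtering particles and weights have already been stored for every time step, and that the state transition density can be evaluated pointwise; the name `trans_logpdf` is a stand-in introduced here, not the paper's notation.

```python
import numpy as np

def backward_simulate(particles, weights, trans_logpdf, rng=None):
    """Draw one state trajectory from the joint smoothing distribution.
    particles[t], weights[t]: stored filtering particles and normalized
    weights at time t; trans_logpdf(x_next, x): log transition density
    f(x_next | x), evaluated elementwise over a particle array x."""
    rng = np.random.default_rng(rng)
    T = len(particles)
    path = np.empty(T)
    # sample x_T from the final filtering distribution
    idx = rng.choice(len(particles[-1]), p=weights[-1])
    path[-1] = particles[-1][idx]
    for t in range(T - 2, -1, -1):
        # reweight the time-t particles by compatibility with the sampled x_{t+1}
        logw = np.log(weights[t]) + trans_logpdf(path[t + 1], particles[t])
        w = np.exp(logw - logw.max())
        w /= w.sum()
        path[t] = particles[t][rng.choice(len(particles[t]), p=w)]
    return path
```

Repeating the draw yields independent realizations of historical state sequences, the nonlinear analogue of a Kalman simulation smoother's output.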
People Tracking Using Hybrid Monte Carlo Filtering
2001
Cited by 110 (6 self)
Abstract
Particle filters are used for hidden state estimation with nonlinear dynamical systems. The inference of 3D human motion is a natural application, given the nonlinear dynamics of the body and the nonlinear relation between states and image observations. However, the application of particle filters has been limited to cases where the number of state variables is relatively small, because the number of samples needed for high-dimensional problems can be prohibitive. We describe a filter that uses hybrid Monte Carlo (HMC) to obtain samples in high-dimensional spaces. It uses multiple Markov chains that use posterior gradients to rapidly explore the state space, yielding fair samples from the posterior. We find that the HMC filter is several thousand times faster than a conventional particle filter on a 28D people tracking problem.
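A minimal sketch of one hybrid Monte Carlo update of the kind such a filter relies on, assuming a differentiable log-posterior with a gradient function; the step size and leapfrog count are illustrative tuning parameters, not values from the paper.

```python
import numpy as np

def hmc_step(x, log_post, grad, step=0.1, n_leap=20, rng=None):
    """One hybrid/Hamiltonian Monte Carlo update targeting exp(log_post).
    Posterior gradients drive long leapfrog trajectories that propose
    distant moves; a Metropolis accept/reject step keeps the draw exact."""
    rng = np.random.default_rng(rng)
    p = rng.standard_normal(x.shape)             # auxiliary momentum
    x_new, p_new = x.copy(), p.copy()
    # leapfrog integration of the Hamiltonian dynamics
    p_new += 0.5 * step * grad(x_new)
    for _ in range(n_leap - 1):
        x_new += step * p_new
        p_new += step * grad(x_new)
    x_new += step * p_new
    p_new += 0.5 * step * grad(x_new)
    # accept with the Metropolis ratio of total energies
    log_accept = (log_post(x_new) - 0.5 * p_new @ p_new) \
               - (log_post(x) - 0.5 * p @ p)
    return x_new if np.log(rng.uniform()) < log_accept else x
```

In the filter, several such chains are run in parallel per time step, each targeting the current posterior over the high-dimensional body pose.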
Bayesian Dynamic Factor Models and Portfolio Allocation
Journal of Business and Economic Statistics, 2000
Cited by 77 (7 self)
Abstract
This article is available in electronic form on the ISDS web site, http://www.stat.duke.edu
Particle Filtering for Partially Observed Gaussian State Space Models
J. R. Statist. Soc. B, 2002
Cited by 63 (8 self)
Abstract
In this paper, we shall concentrate on the following class of state space models. Let t = 1, 2, ... denote discrete time: then
x_t = A_t x_{t-1} + B_t v_t + F_t u_t,   x_0 ~ N(x̄_0, P_0),   (1)
y_t = C_t x_t + D_t ε_t + G_t u_t,   (2)
z_t ~ p(z_t | y_t),   (3)
where u_t ∈ R^{n_u} is an exogenous process and x_t ∈ R^{n_x} and y_t ∈ R^{n_y} are unobserved processes. The sequences v_t ~ N(0, I_{n_v}) and ε_t ~ N(0, I_{n_ε}) are independent identically distributed (IID) Gaussian. We assume that P_0 > 0; x_0, v_t and ε_t are mutually independent for all t, and the model parameters θ = (x̄_0, P_0, A_t, B_t, C_t, D_t, F_t, G_t; t = 1, 2, ...) are known. The processes (x_t) and (y_t) define a standard linear Gaussian state space model. We do not observe (y_t) in our case, but (z_t). The observations (z_t) are conditionally independent given the processes (x_t) and (y_t) and marginally distributed according to p(z_t | y_t); it is assumed that p(z_t | y_t) can be evaluated pointwise up to a normalizing constant. Typically p(z_t | y_t) belongs to the exponential family. Alternatively, z_t may be a censored or quantized version of y_t. This class of partially observed Gaussian state space models has numerous applications; many examples are discussed, for instance, in de Jong (1997), Manrique and Shephard (1998) and West and Harrison (1997). We want to estimate sequentially in time some characteristics of the posterior distribution p(x_{1:t} | z_{1:t}). Typically, we are interested in computing E(x_t | z_{1:t}) (filtering), E(x_{t+L} | z_{1:t}) (prediction) and E(x_{t-L} | z_{1:t}) (fixed-lag smoothing), where L is a positive integer. These estimates do not in general admit analytical expressions and we must resort to numerical methods.
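The model class can be made concrete with a small simulation sketch. The Poisson observation of y_t below is one hypothetical exponential-family choice of p(z_t | y_t), and the scalar, time-invariant coefficients are illustrative rather than taken from the paper.

```python
import numpy as np

def simulate_pog_ssm(T, A=0.9, B=1.0, C=1.0, D=0.3, rng=None):
    """Simulate one hypothetical instance of the model class:
    x_t = A x_{t-1} + B v_t (latent linear-Gaussian state),
    y_t = C x_t + D eps_t  (Gaussian signal, also unobserved),
    z_t | y_t ~ Poisson(exp(y_t))  -- an exponential-family observation
    of y_t; only z_t is recorded."""
    rng = np.random.default_rng(rng)
    x = rng.normal()                       # x_0 ~ N(0, 1), illustrative
    xs, ys, zs = [], [], []
    for _ in range(T):
        x = A * x + B * rng.normal()       # state transition, eq. (1)
        y = C * x + D * rng.normal()       # partially observed signal, eq. (2)
        z = rng.poisson(np.exp(y))         # what we actually observe, eq. (3)
        xs.append(x); ys.append(y); zs.append(z)
    return np.array(xs), np.array(ys), np.array(zs)
```

Censored or quantized z_t would simply replace the Poisson draw with, e.g., `z = max(y, 0.0)` or `z = round(y)`.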
Particle Filters for State Space Models With the Presence of Static Parameters
2002
Cited by 57 (0 self)
Abstract
In this paper, particle filters for dynamic state space models with unknown static parameters are discussed. The approach is based on marginalizing the static parameters out of the posterior distribution, so that only the state vector needs to be considered. Such a marginalization can always be applied. However, real-time applications are possible only when the distribution of the unknown parameters, given both the observations and the hidden state vector, depends on some low-dimensional sufficient statistics. Such sufficient statistics are present in many commonly used state space models. Marginalizing the static parameters avoids the impoverishment problem that typically occurs when static parameters are included as part of the state vector. The filters are tested on several different models, with promising results.
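A sketch of the idea on an assumed toy model (a mean-reverting state observed with an unknown static bias b): each particle carries the low-dimensional Gaussian sufficient statistics (m, P) of p(b | x_{1:t}, y_{1:t}) instead of a sampled value of b. This is in the spirit of the marginalization described above, not the paper's exact algorithms or models.

```python
import numpy as np

def marginalized_bias_filter(y, n=500, r=0.5, s0=4.0, rng=None):
    """Particle filter with a static parameter marginalized out, for the toy model
      x_t = 0.5 x_{t-1} + v_t,  y_t = x_t + b + w_t,
    v_t ~ N(0, 1), w_t ~ N(0, r), with unknown static bias b ~ N(0, s0).
    Each particle stores (m, P), the conjugate Gaussian posterior of b,
    which avoids the impoverishment of sampling b once and for all."""
    rng = np.random.default_rng(rng)
    x = rng.normal(0.0, 1.0, n)
    m = np.zeros(n)                  # per-particle posterior mean of b
    P = np.full(n, s0)               # per-particle posterior variance of b
    est_b = []
    for yt in y:
        x = 0.5 * x + rng.normal(0.0, 1.0, n)    # propagate the state
        # marginal predictive p(y_t | x_t, m, P) = N(x + m, P + r)
        S = P + r
        logw = -0.5 * (yt - x - m) ** 2 / S - 0.5 * np.log(S)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        est_b.append(np.sum(w * m))              # running estimate of E[b]
        # conjugate (scalar Kalman) update of the sufficient statistics
        K = P / S
        m = m + K * (yt - x - m)
        P = P * (1 - K)
        idx = rng.choice(n, size=n, p=w)         # resample states and stats jointly
        x, m, P = x[idx], m[idx], P[idx]
    return np.array(est_b)
```

The key property is that (m, P) summarize everything the trajectory says about b, so b never has to enter the state vector.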
Particle Methods for Change Detection, System Identification, and Control
Proceedings of the IEEE, 2004
Cited by 53 (0 self)
Abstract
... this paper is to provide a detailed overview of them ...
Bayesian inference for nonlinear multivariate diffusion models observed with error
Computational Statistics and Data Analysis, 2008
Cited by 45 (9 self)
Abstract
Diffusion processes governed by stochastic differential equations (SDEs) are a well-established tool for modelling continuous-time data from a wide range of areas. Consequently, techniques have been developed to estimate diffusion parameters from partial and discrete observations. Likelihood-based inference can be problematic as closed-form transition densities are rarely available. One widely used solution involves the introduction of latent data points between every pair of observations to allow an Euler-Maruyama approximation of the true transition densities to become accurate. In recent literature, Markov chain Monte Carlo (MCMC) methods have been used to sample the posterior distribution of latent data and model parameters; however, naive schemes suffer from a mixing problem that worsens with the degree of augmentation. In this paper, we explore an MCMC scheme whose performance is not adversely affected by the number of latent values. We illustrate the methodology by estimating parameters governing an autoregulatory gene network, using partial and discrete data that is subject to measurement error.
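The Euler-Maruyama approximation mentioned in this abstract can be sketched as follows; refining the time step plays the same role as introducing latent data points between observations. The Ornstein-Uhlenbeck drift and diffusion in the usage comment are illustrative choices, not the paper's gene-network model.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_end, n_steps, rng=None):
    """Euler-Maruyama discretisation of dX_t = drift(X) dt + diffusion(X) dW_t.
    A larger n_steps (finer grid) makes the discrete transition densities
    a better approximation of the true, usually intractable, ones."""
    rng = np.random.default_rng(rng)
    dt = t_end / n_steps
    path = np.empty(n_steps + 1)
    path[0] = x0
    for i in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))          # Brownian increment
        path[i + 1] = path[i] + drift(path[i]) * dt + diffusion(path[i]) * dW
    return path

# e.g. an Ornstein-Uhlenbeck process, dX = -2 X dt + 0.5 dW:
# path = euler_maruyama(lambda x: -2 * x, lambda x: 0.5, 1.0, 5.0, 1000, rng=0)
```

In the MCMC schemes the abstract discusses, the fine-grid values between observations are treated as latent variables and sampled jointly with the parameters.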