Results 1-10 of 28
Sequential Monte Carlo Methods for Dynamic Systems
Journal of the American Statistical Association, 1998
Abstract

Cited by 474 (9 self)
A general framework for using Monte Carlo methods in dynamic systems is provided and its wide applications indicated. Under this framework, several currently available techniques are studied and generalized to accommodate more complex features. All of these methods are partial combinations of three ingredients: importance sampling and resampling, rejection sampling, and Markov chain iterations. We deliver a guideline on how they should be used and under what circumstances each method is most suitable. Through the analysis of differences and connections, we consolidate these methods into a generic algorithm by combining desirable features. In addition, we propose a general use of Rao-Blackwellization to improve performance. Examples from econometrics and engineering are presented to demonstrate the importance of Rao-Blackwellization and to compare different Monte Carlo procedures. Keywords: Blind deconvolution; Bootstrap filter; Gibbs sampling; Hidden Markov model; Kalman filter; Markov...
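The three ingredients this abstract names can be illustrated with a minimal bootstrap filter sketch (importance sampling with multinomial resampling) for a toy linear-Gaussian state-space model of our own choosing, not one of the paper's examples:

```python
import numpy as np

def bootstrap_filter(y, n_particles=1000, seed=0):
    """Bootstrap particle filter for the toy model
    x_t = 0.9*x_{t-1} + v_t,  y_t = x_t + w_t,  v_t, w_t ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)            # particles from the prior
    means = []
    for yt in y:
        x = 0.9 * x + rng.normal(0.0, 1.0, n_particles)  # propagate
        logw = -0.5 * (yt - x) ** 2                      # Gaussian log-likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()                                     # normalized importance weights
        means.append(np.sum(w * x))                      # filtering mean
        idx = rng.choice(n_particles, n_particles, p=w)  # multinomial resampling
        x = x[idx]
    return np.array(means)
```

Because the toy model is linear-Gaussian, the filtered means could be checked against an exact Kalman filter; the sketch omits rejection sampling and Markov chain moves, the other two ingredients the paper combines.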
Architectures for Efficient Implementation of Particle Filters
2004
Abstract

Cited by 19 (0 self)
Particle filters are sequential Monte Carlo methods used in numerous problems where time-varying signals must be processed in real time and where the objective is to estimate various unknowns of the signal and/or detect events described by the signals. The standard solutions of such problems in many applications are based on Kalman filters or extended Kalman filters. In situations where the problems are nonlinear or the noise that distorts the signals is non-Gaussian, the Kalman filters provide a solution that may be far from optimal. Particle filters are an intriguing alternative to the Kalman filters due to their excellent performance in very difficult problems, including communications, signal processing, navigation, and computer vision. Hence, particle filters have recently been the focus of wide research, and an immense literature can be found on their theory. Most of these works recognize the complexity and computational intensity of these filters, but there has been no effort directed toward the implementation of these filters in hardware. The objective of this dissertation is to develop, design, and build efficient hardware for particle filters, and thereby bring them closer to practical applications. The fact that particle filters outperform most of the traditional filtering methods in many complex practical scenarios, coupled with the challenges related to decreasing their computational complexity and improving real-time performance, makes this work worthwhile. The main ...
Computational Methods for Complex Stochastic Systems: A Review of Some Alternatives to MCMC
Abstract

Cited by 14 (2 self)
We consider analysis of complex stochastic models based upon partial information. MCMC and reversible jump MCMC are often the methods of choice for such problems, but in some situations they can be difficult to implement and can suffer from problems such as poor mixing and the difficulty of diagnosing convergence. Here we review three alternatives to MCMC methods: importance sampling, the forward-backward algorithm, and sequential Monte Carlo (SMC). We discuss how to design good proposal densities for importance sampling, show some of the range of models for which the forward-backward algorithm can be applied, and show how resampling ideas from SMC can be used to improve the efficiency of the other two methods. We demonstrate these methods on a range of examples, including estimating the transition density of a diffusion and of a discrete-state continuous-time Markov chain; inferring structure in population genetics; and segmenting genetic divergence data.
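As a reminder of what the first alternative looks like in practice, here is a self-normalized importance sampling sketch with an effective-sample-size diagnostic; the Gaussian target and heavier-tailed proposal are illustrative choices, not taken from the paper:

```python
import numpy as np

def snis_mean(log_target, sample_proposal, log_proposal, n=100_000, seed=0):
    """Self-normalized importance sampling estimate of E_target[X],
    with the effective sample size 1 / sum(w_i^2) as a quality diagnostic."""
    rng = np.random.default_rng(seed)
    x = sample_proposal(rng, n)
    logw = log_target(x) - log_proposal(x)   # unnormalized log-weights
    w = np.exp(logw - logw.max())
    w /= w.sum()                             # normalizing constants cancel here
    ess = 1.0 / np.sum(w ** 2)
    return np.sum(w * x), ess

# Target: N(2, 1) up to a constant; proposal: heavier-tailed N(0, 3^2).
est, ess = snis_mean(
    log_target=lambda x: -0.5 * (x - 2.0) ** 2,
    sample_proposal=lambda rng, n: rng.normal(0.0, 3.0, n),
    log_proposal=lambda x: -0.5 * (x / 3.0) ** 2 - np.log(3.0),
)
```

A proposal with lighter tails than the target would make the weights degenerate, which is exactly the design issue the abstract refers to.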
Computing Normalizing Constants for Finite Mixture Models via Incremental Mixture Importance Sampling (IMIS)
2003
Abstract

Cited by 14 (5 self)
We propose a method for approximating integrated likelihoods in finite mixture models. We formulate the model in terms of the unobserved group memberships, z, and make them the variables of integration. The integral is then evaluated using importance sampling over the z. We propose an adaptive importance sampling function which is itself a mixture, with two types of component distributions, one concentrated and one diffuse. The more concentrated type of component serves the usual purpose of an importance sampling function, sampling mostly group assignments of high posterior probability. The less concentrated type of component allows the importance sampling function to explore the space in a controlled way to find other, unvisited assignments with high posterior probability. Components are added adaptively, one at a time, to cover areas of high posterior probability not well covered by the current importance sampling function. The method is called Incremental Mixture Importance Sampling (IMIS). IMIS is easy to implement and to monitor for convergence. It scales easily for higher dimensional ...
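A rough sketch of the IMIS idea, simplified to one dimension with our own illustrative target (the paper's formulation integrates over group memberships z): a diffuse starting component, with concentrated components added where the current importance weights are largest:

```python
import numpy as np

def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def imis_normalizer(log_target_unnorm, n_per_stage=20_000, n_stages=4, seed=0):
    """Incremental mixture importance sampling sketch (1-D): estimate the
    normalizing constant of an unnormalized density, adding one concentrated
    component per stage where the current weights are largest."""
    rng = np.random.default_rng(seed)
    comps = [(0.0, 5.0)]                       # (mean, sd): initial diffuse component
    for _ in range(n_stages):
        k = len(comps)
        x = np.concatenate([rng.normal(m, s, n_per_stage // k + 1)
                            for m, s in comps])             # equal draws per component
        q = np.mean([normal_pdf(x, m, s) for m, s in comps], axis=0)
        w = np.exp(log_target_unnorm(x)) / q                # importance weights
        comps.append((x[np.argmax(w)], 1.0))   # concentrate where coverage is worst
    return np.mean(w)                          # estimate of the normalizing constant

# Unnormalized target exp(-(x-3)^2/2); the true constant is sqrt(2*pi).
z = imis_normalizer(lambda x: -0.5 * (x - 3.0) ** 2)
```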
Resampling Algorithms for Particle Filters: A Computational Complexity Perspective
Abstract

Cited by 12 (1 self)
Newly developed resampling algorithms for particle filters suitable for real-time implementation are described and their analysis is presented. The new algorithms reduce the complexity of both hardware and DSP realization by addressing common issues such as reducing the number of operations and memory accesses. Moreover, the algorithms allow the use of higher sampling frequencies by overlapping the resampling step in time with the other particle filtering steps. Since resampling is not dependent on any particular application, the analysis is appropriate for all types of particle filters that use resampling. The performance of the algorithms is evaluated on particle filters applied to bearings-only tracking and to joint detection and estimation in wireless communications. We have demonstrated that the proposed algorithms reduce the complexity without performance degradation. Key words: particle filters, resampling, computational complexity, sequential implementation
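For concreteness, here is the standard systematic resampling scheme, an O(N) method that needs only a single uniform draw; this is a textbook version, not necessarily one of the paper's new algorithms:

```python
import numpy as np

def systematic_resample(weights, rng=None, u0=None):
    """Systematic resampling: O(N), one uniform draw, low variance.
    Returns indices of the particles to keep (with repetition);
    u0 may be fixed for reproducibility."""
    w = np.asarray(weights, dtype=float)
    n = len(w)
    if u0 is None:
        u0 = (rng or np.random.default_rng()).uniform()
    positions = (u0 + np.arange(n)) / n        # one stratified grid of points
    return np.searchsorted(np.cumsum(w), positions)
```

With uniform weights the scheme returns every particle exactly once, which is one reason it is popular in hardware-oriented implementations.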
Stochastic Neural Networks with Applications to Nonlinear Time Series
Journal of the American Statistical Association, 2001
Abstract

Cited by 5 (2 self)
Neural networks have a burgeoning literature in nonlinear time series. We consider here a variant of the conventional neural network model, called the stochastic neural network, that can be used to approximate complex nonlinear stochastic systems. We show how the EM algorithm can be used to develop efficient estimation schemes that have much lower computational complexity than those for conventional neural networks. This enables us to carry out model selection procedures, such as the BIC, to choose the number of hidden units and the input variables for each hidden unit. Moreover, stochastic neural networks are shown to have the universal approximation property of neural networks. Other important properties of the proposed model are also given, and model-based multi-step-ahead forecasts are provided. For illustration, we fit stochastic neural network models to several real and simulated time series. Our results show that the fitted models improve post-sample forecasts over conventional neural networks and other nonlinear/nonparametric models.
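The paper's estimation scheme is specific to stochastic neural networks, but the flavour of EM-based fitting plus BIC model selection can be suggested with a generic two-component Gaussian mixture sketch (all modelling choices here are illustrative, not the authors' model):

```python
import numpy as np

def em_gmm2(x, iters=200):
    """EM for a two-component 1-D Gaussian mixture; returns params and BIC."""
    mu = np.quantile(x, [0.25, 0.75])          # deterministic, well-spread init
    sd = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])

    def comp_dens():                           # per-component weighted densities
        z = (x[:, None] - mu) / sd
        return pi * np.exp(-0.5 * z ** 2) / (sd * np.sqrt(2 * np.pi))

    for _ in range(iters):
        dens = comp_dens()
        r = dens / dens.sum(axis=1, keepdims=True)   # E-step: responsibilities
        nk = r.sum(axis=0)                           # M-step: weighted updates
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

    loglik = np.log(comp_dens().sum(axis=1)).sum()
    bic = -2 * loglik + 5 * np.log(len(x))           # 5 free parameters
    return mu, sd, pi, bic
```

BIC here counts five free parameters (two means, two standard deviations, one mixing weight); in the paper's setting the analogous criterion trades fit against the number of hidden units.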
Efficient Bayesian analysis of multiple changepoint models with dependence across segments
2010
Abstract

Cited by 4 (0 self)
We consider Bayesian analysis of a class of multiple changepoint models. While there are a variety of efficient ways to analyse these models if the parameters associated with each segment are independent, there are few general approaches for models where the parameters are dependent. Under the assumption that the dependence is Markov, we propose an efficient online algorithm for sampling from an approximation to the posterior distribution of the number and position of the changepoints. In a simulation study, we show that the error introduced by the approximation is negligible. We illustrate the power of our approach by fitting piecewise polynomial models to data, under a model which allows for either continuity or discontinuity of the underlying curve at each changepoint. This method is competitive with, or outperforms, other methods for inferring curves from noisy data; and, uniquely, it allows for inference of the locations of discontinuities in the underlying curve.
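The authors' online multi-changepoint algorithm is more involved, but the underlying Bayesian computation can be hinted at with a much simpler exact posterior over a single changepoint in a Gaussian mean-shift model (our own toy setting, with segment means marginalized under a N(0, v) prior):

```python
import numpy as np

def log_seg_ml(y, v=100.0):
    """Log marginal likelihood of a segment: y_i ~ N(mu, 1), mu ~ N(0, v)."""
    n, s, ss = len(y), y.sum(), (y ** 2).sum()
    return -0.5 * (n * np.log(2 * np.pi) + np.log(1 + n * v)
                   + ss - v * s ** 2 / (1 + n * v))

def changepoint_posterior(y):
    """Exact posterior over a single changepoint position, uniform prior on tau;
    entry i is the probability that the first segment is y[: i + 1]."""
    lp = np.array([log_seg_ml(y[:t]) + log_seg_ml(y[t:])
                   for t in range(1, len(y))])
    p = np.exp(lp - lp.max())
    return p / p.sum()
```

The paper's setting differs in allowing an unknown number of changepoints and Markov dependence between segment parameters, which is what requires the online approximation.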
On sequential Monte Carlo, partial rejection control and approximate Bayesian computation
2008
Abstract

Cited by 4 (2 self)
We present a sequential Monte Carlo sampler variant of the partial rejection control algorithm introduced by Liu (2001), termed SMC sampler PRC, and show that this variant can be placed within the same framework as the sequential Monte Carlo sampler of Del Moral et al. (2006). We make connections with existing algorithms and theoretical results, and extend some theoretical results to the SMC sampler PRC setting. We examine the properties of the SMC sampler PRC and give recommendations for user-specified quantities. We also study the special case of the SMC sampler PRC in the “likelihood-free” approximate Bayesian computation framework, as introduced by Sisson et al. (2007).
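As background on the likelihood-free setting referred to at the end, here is a plain rejection-ABC sketch (not the SMC sampler PRC itself); the model, prior, summary statistic, and tolerance are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observed" data from N(theta_true = 2, 1); the likelihood is treated as unavailable.
y_obs = rng.normal(2.0, 1.0, 50)
s_obs = y_obs.mean()                       # summary statistic

# Rejection ABC: draw theta from the prior, simulate data, and keep draws
# whose simulated summary lands within eps of the observed one.
n_draws, eps = 200_000, 0.1
theta = rng.uniform(-10.0, 10.0, n_draws)            # prior draws
s_sim = rng.normal(theta, 1.0 / np.sqrt(50))         # simulated sample mean (shortcut:
                                                     # drawn from its sampling distribution)
accepted = theta[np.abs(s_sim - s_obs) < eps]        # approximate posterior sample
```

Plain rejection ABC discards most draws; the SMC and partial rejection control machinery in the paper is aimed at exactly this inefficiency.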
Another look at rejection sampling through importance sampling
Statistics and Probability Letters, 2005
Abstract

Cited by 3 (0 self)
We provide a different view of rejection sampling by putting it in the framework of importance sampling. When rejection sampling with an envelope function g is viewed as a special importance sampling algorithm, we show that it is inferior to the importance sampling algorithm with g as the proposal distribution in terms of the chi-square distance between the proposal distribution and the target distribution. Similar conclusions are drawn when comparing rejection control with importance sampling.
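The comparison can be made concrete with a toy target: Beta(2,2), whose density f(x) = 6x(1-x) is bounded by M = 1.5 times the Uniform(0,1) density, so the same g serves as both rejection envelope and importance proposal (our own example, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: 6.0 * x * (1.0 - x)      # Beta(2,2) density on [0, 1]
M = 1.5                                # envelope constant: f(x) <= M * g(x), g = Uniform(0,1)

x = rng.uniform(size=100_000)

# Rejection sampling: accept each draw with probability f(x) / (M * g(x)).
accept = rng.uniform(size=x.size) < f(x) / M
mean_rs = x[accept].mean()             # exact draws, but only ~1/M of them survive

# Importance sampling with the same g as proposal: keep every draw, weight by f/g.
w = f(x)
mean_is = np.sum(w * x) / np.sum(w)
```

Both estimates target the Beta(2,2) mean of 0.5; rejection sampling throws away about a third of the draws, which is the kind of loss the paper quantifies via the chi-square distance.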
Bayesian Time Series: Analysis Methods Using Simulation-Based Computation
2000
Abstract

Cited by 1 (0 self)
This dissertation introduces new simulation-based analysis approaches, including both sequential and offline learning algorithms, for various Bayesian time series models. We provide a Markov chain Monte Carlo (MCMC) method for an autoregressive (AR) model with innovations following exponential power distributions, using the fact that an exponential power distribution is a scale mixture of normals. This model has application in signal processing, specifically image processing, with orthogonal wave...
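The scale-mixture fact the abstract relies on can be checked empirically for the Laplace case (exponential power with shape 1), which is a normal with an exponentially distributed variance: X = b·sqrt(2W)·Z with W ~ Exp(1), Z ~ N(0,1):

```python
import numpy as np

rng = np.random.default_rng(0)
b = 1.0                                     # Laplace scale parameter
n = 200_000

# Laplace(0, b) as a scale mixture of normals:
# draw a variance 2*b^2*W with W ~ Exp(1), then a normal with that variance.
w = rng.exponential(1.0, n)
x = rng.normal(0.0, 1.0, n) * b * np.sqrt(2.0 * w)

# Direct Laplace draws for comparison.
y = rng.laplace(0.0, b, n)
```

Conditioning on the mixing variable is what turns the non-Gaussian model into a conditionally Gaussian one that standard MCMC updates can handle.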