Results 1–10 of 22
Sequential Monte Carlo Samplers
, 2002
Abstract

Cited by 311 (48 self)
In this paper, we propose a general algorithm to sample sequentially from a sequence of probability distributions known up to a normalizing constant and defined on a common space. A sequence of increasingly large artificial joint distributions is built; each of these distributions admits a marginal which is a distribution of interest. To sample from these distributions, we use sequential Monte Carlo methods. We show that these methods can be interpreted as interacting particle approximations of a nonlinear Feynman-Kac flow in distribution space. One interpretation of the Feynman-Kac flow corresponds to a nonlinear Markov kernel admitting a specified invariant distribution and is a natural nonlinear extension of the standard Metropolis-Hastings algorithm. Many theoretical results have already been established for such flows and their particle approximations. We demonstrate the use of these algorithms through simulation.
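The scheme this abstract describes can be sketched in a few lines: weight particles when moving between successive distributions, resample, then apply an MCMC move targeting the current distribution. The tempered path, target density, and all tuning constants below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log-density: a bimodal 1-D Gaussian mixture (illustrative choice).
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def smc_sampler(n_particles=2000, n_steps=20, step_size=1.0):
    # Temper from a broad N(0, 5^2) initial distribution (beta = 0)
    # to the target (beta = 1) along a geometric path.
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    log_prior = lambda x: -0.5 * (x / 5.0) ** 2
    x = rng.normal(0.0, 5.0, n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Incremental importance weights for moving from beta_prev to beta.
        log_w = (b - b_prev) * (log_target(x) - log_prior(x))
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Multinomial resampling.
        x = x[rng.choice(n_particles, n_particles, p=w)]
        # One random-walk Metropolis-Hastings move leaving the tempered
        # density prior^(1-b) * target^b invariant.
        prop = x + step_size * rng.normal(size=n_particles)
        log_acc = (b * log_target(prop) + (1 - b) * log_prior(prop)
                   - b * log_target(x) - (1 - b) * log_prior(x))
        accept = np.log(rng.uniform(size=n_particles)) < log_acc
        x = np.where(accept, prop, x)
    return x

samples = smc_sampler()
```

Resampling every step is wasteful (in practice one resamples only when the effective sample size drops), but it keeps the sketch short.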
A survey of sequential Monte Carlo methods for economics and finance
, 2009
Abstract

Cited by 34 (7 self)
This paper serves as an introduction and survey for economists to the field of sequential Monte Carlo methods, which are also known as particle filters. Sequential Monte Carlo methods are simulation-based algorithms used to compute the high-dimensional and/or complex integrals that arise regularly in applied work. These methods are becoming increasingly popular in economics and finance, from dynamic stochastic general equilibrium models in macroeconomics to option pricing. The objective of this paper is to explain the basics of the methodology, provide references to the literature, and cover some of the theoretical results that justify the methods in practice.
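The particle filters this survey covers reduce, in their simplest "bootstrap" form, to propagating particles through the state equation, weighting by the observation likelihood, and resampling. The toy linear-Gaussian model and its parameters below are hypothetical, chosen only to illustrate the mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy linear-Gaussian state-space model (hypothetical parameters):
#   x_t = 0.9 * x_{t-1} + v_t,  v_t ~ N(0, 1)
#   y_t = x_t + e_t,            e_t ~ N(0, 0.5^2)
T, N = 50, 1000
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal()
y = x_true + 0.5 * rng.normal(size=T)

# Bootstrap particle filter: propagate with the state equation,
# weight by the observation likelihood, resample.
particles = rng.normal(0.0, 1.0, N)
filt_mean = np.zeros(T)
for t in range(T):
    if t > 0:
        particles = 0.9 * particles + rng.normal(size=N)
    log_w = -0.5 * ((y[t] - particles) / 0.5) ** 2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    particles = particles[rng.choice(N, N, p=w)]
    filt_mean[t] = particles.mean()
```

Here `filt_mean[t]` approximates E[x_t | y_1:t]; for this linear-Gaussian toy model the exact answer is available from the Kalman filter, which makes it a convenient sanity check.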
Computational Methods for Complex Stochastic Systems: A Review of Some Alternatives to MCMC
Abstract

Cited by 34 (5 self)
We consider analysis of complex stochastic models based upon partial information. MCMC and reversible jump MCMC are often the methods of choice for such problems, but in some situations they can be difficult to implement and can suffer from problems such as poor mixing and the difficulty of diagnosing convergence. Here we review three alternatives to MCMC methods: importance sampling, the forward-backward algorithm, and sequential Monte Carlo (SMC). We discuss how to design good proposal densities for importance sampling, show some of the range of models for which the forward-backward algorithm can be applied, and show how resampling ideas from SMC can be used to improve the efficiency of the other two methods. We demonstrate these methods on a range of examples, including estimating the transition density of a diffusion and of a discrete-state continuous-time Markov chain; inferring structure in population genetics; and segmenting genetic divergence data.
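The first alternative this review discusses, importance sampling with a well-chosen proposal, can be sketched as follows: a self-normalized estimator for a target known only up to a constant, using a heavy-tailed proposal so the weights stay bounded. The densities here are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Self-normalized importance sampling: estimate E[X] under a target known
# only up to a normalizing constant (here N(1, 1), unnormalized), using a
# heavy-tailed standard Cauchy proposal.
n = 200_000
x = rng.standard_cauchy(n)
log_p = -0.5 * (x - 1.0) ** 2      # unnormalized target log-density
log_q = -np.log1p(x ** 2)          # Cauchy log-density (constant dropped)

log_w = log_p - log_q
w = np.exp(log_w - log_w.max())
w /= w.sum()                       # unknown normalizing constants cancel here

est_mean = np.sum(w * x)           # should be close to the target mean, 1
ess = 1.0 / np.sum(w ** 2)         # effective sample size diagnostic
```

The effective sample size `ess` is the standard diagnostic for a poor proposal: values far below `n` indicate that a few particles carry nearly all the weight, which is exactly the degeneracy that the SMC resampling ideas reviewed in the paper are designed to mitigate.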
Convergence of adaptive mixtures of importance sampling schemes. The Annals of Statistics
, 2007
Iterated Filtering
, 2011
Abstract

Cited by 14 (3 self)
Inference for partially observed Markov process models has been a longstanding methodological challenge with many scientific and engineering applications. Iterated filtering algorithms maximize the likelihood function for partially observed Markov process models by solving a recursive sequence of filtering problems. We present new theoretical results pertaining to the convergence of iterated filtering algorithms implemented via sequential Monte Carlo filters. This theory complements the growing body of empirical evidence that iterated filtering algorithms provide an effective inference strategy for scientific models of nonlinear dynamic systems. The first step in our theory involves studying a new recursive approach for maximizing the likelihood function of a latent variable model, when this likelihood is evaluated via importance sampling. This leads to the consideration of an iterated importance sampling algorithm which serves as a simple special case of iterated filtering, and may have applicability in its own right.
Computational advances for and from Bayesian analysis
 Statist. Sci.
, 2004
Abstract

Cited by 14 (0 self)
The emergence, in past years, of Bayesian analysis as the solution to the modeling of complex problems in many methodological and applied fields cannot be dissociated from major changes in its computational implementation. We show in this review how the advances in Bayesian analysis and statistical computation are intermingled.
Interacting Multiple Try Algorithms with Different Proposal Distributions
, 2010
Online data processing: Comparison of Bayesian regularized particle filters
 Electronic Journal of Statistics
, 2009
Abstract

Cited by 9 (5 self)
The aim of this paper is to compare three regularized particle filters in an online data processing context. We carry out the comparison in terms of hidden-state filtering and parameter estimation, considering a Bayesian paradigm and a univariate stochastic volatility model. We discuss the use of an improper prior distribution in the initialization of the filtering procedure and show that the regularized Auxiliary Particle Filter (APF) outperforms the regularized Sequential Importance Sampling (SIS) and the regularized Sampling
Diagnostics of prior-data agreement in applied Bayesian analysis
 J. Appl. Statist
, 2008
Abstract

Cited by 4 (2 self)
Summary. This article focuses on the definition and study of a binary Bayesian criterion which measures the statistical agreement between a subjective prior and the data. The setting of this work is concrete Bayesian studies. It is an alternative and complementary tool to the method recently proposed by Evans and Moshonov (2006). Both methods aim to help the Bayesian analyst in the steps preliminary to the posterior computation. Our criterion is defined as a ratio of Kullback-Leibler divergences; two of its main features are to make the checking of a hierarchical prior easy and to serve as a default calibration tool for obtaining flat but proper priors in applications. Discrete and continuous distributions exemplify the approach, and an industrial case study in reliability, involving the Weibull distribution, is highlighted.
Convergence of adaptive mixtures of importance sampling schemes
Abstract

Cited by 1 (0 self)