Results 1–10 of 82
A survey of sequential Monte Carlo methods for economics and finance
, 2009
Cited by 11 (2 self)
This paper serves as an introduction and survey for economists to the field of sequential Monte Carlo methods, which are also known as particle filters. Sequential Monte Carlo methods are simulation-based algorithms used to compute the high-dimensional and/or complex integrals that arise regularly in applied work. These methods are becoming increasingly popular in economics and finance, from dynamic stochastic general equilibrium models in macroeconomics to option pricing. The objective of this paper is to explain the basics of the methodology, provide references to the literature, and cover some of the theoretical results that justify the methods in practice.
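The basic sequential Monte Carlo recipe the survey introduces (propagate, weight, resample) can be sketched in a few lines. The following is a minimal illustration on a hypothetical linear-Gaussian model x_t = phi*x_{t-1} + v_t, y_t = x_t + e_t; all parameter values and names are illustrative, not taken from the surveyed paper.

```python
import numpy as np

def bootstrap_particle_filter(y, n_particles=500, phi=0.9, sigma_x=1.0, sigma_y=1.0, seed=0):
    """Bootstrap particle filter for the toy model
    x_t = phi * x_{t-1} + v_t,  y_t = x_t + e_t,  with Gaussian noise."""
    rng = np.random.default_rng(seed)
    T = len(y)
    x = rng.normal(0.0, sigma_x, n_particles)                # initial particle cloud
    means = np.empty(T)
    for t in range(T):
        x = phi * x + rng.normal(0.0, sigma_x, n_particles)  # propagate through the dynamics
        log_w = -0.5 * ((y[t] - x) / sigma_y) ** 2           # log-likelihood weights
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means[t] = np.sum(w * x)                             # estimate of E[x_t | y_1:t]
        x = x[rng.choice(n_particles, n_particles, p=w)]     # multinomial resampling
    return means
```

The weighted particle average approximates exactly the kind of high-dimensional integral (the filtering expectation) that the survey discusses.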
Bayesian nonparametric inference of switching linear dynamical systems
, 2010
Cited by 9 (1 self)
Many complex dynamical phenomena can be effectively modeled by a system that switches among a set of conditionally linear dynamical modes. We consider two such models: the switching linear dynamical system (SLDS) and the switching vector autoregressive (VAR) process. Our Bayesian nonparametric approach utilizes a hierarchical Dirichlet process prior to learn an unknown number of persistent, smooth dynamical modes. We additionally employ automatic relevance determination to infer a sparse set of dynamic dependencies, allowing us to learn SLDS with varying state dimension or switching VAR processes with varying autoregressive order. We develop a sampling algorithm that combines a truncated approximation to the Dirichlet process with efficient joint sampling of the mode and state sequences. The utility and flexibility of our model are demonstrated on synthetic data, sequences of dancing honey bees, the IBOVESPA stock index, and a maneuvering target tracking application. Index Terms—Autoregressive processes, Bayesian methods, hidden Markov models, state-space methods, time series analysis.
Ancestor Sampling for Particle Gibbs
Cited by 7 (5 self)
We present a novel method in the family of particle MCMC methods that we refer to as particle Gibbs with ancestor sampling (PGAS). Similarly to the existing PG with backward simulation (PGBS) procedure, we use backward sampling to (considerably) improve the mixing of the PG kernel. Instead of using separate forward and backward sweeps as in PGBS, however, we achieve the same effect in a single forward sweep. We apply the PGAS framework to the challenging class of non-Markovian state-space models. We develop a truncation strategy for these models that is applicable in principle to any backward-simulation-based method, but which is particularly well suited to the PGAS framework. In particular, as we show in a simulation study, PGAS can yield an order-of-magnitude improvement in accuracy relative to PGBS due to its robustness to the truncation error. Several application examples are discussed, including Rao-Blackwellized particle smoothing and inference in degenerate state-space models.
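The single forward sweep described above can be sketched as follows for a toy Markovian AR(1) state-space model (the non-Markovian case treated in the paper additionally requires the truncation strategy and is not shown). This is an illustrative assumption-laden sketch, not the paper's implementation; the key line is the ancestor-sampling draw for the reference particle.

```python
import numpy as np

def pgas_step(y, x_ref, n_particles=20, phi=0.9, sx=1.0, sy=1.0, rng=None):
    """One conditional-particle-filter sweep with ancestor sampling for
    the toy model x_t = phi * x_{t-1} + v_t, y_t = x_t + e_t."""
    rng = rng or np.random.default_rng()
    N, T = n_particles, len(y)
    X = np.empty((N, T))                 # particles
    A = np.zeros((N, T), dtype=int)      # ancestor indices
    X[:, 0] = rng.normal(0.0, sx, N)
    X[N - 1, 0] = x_ref[0]               # reference trajectory occupies slot N-1
    log_w = -0.5 * ((y[0] - X[:, 0]) / sy) ** 2
    for t in range(1, T):
        w = np.exp(log_w - log_w.max()); w /= w.sum()
        A[:, t] = rng.choice(N, N, p=w)                        # resample ancestors
        X[:, t] = phi * X[A[:, t], t - 1] + rng.normal(0.0, sx, N)
        X[N - 1, t] = x_ref[t]                                 # re-insert the reference state
        # ancestor sampling: new ancestor for the reference particle,
        # weighting each particle by w_{t-1} * f(x_ref[t] | x_{t-1})
        log_m = log_w - 0.5 * ((x_ref[t] - phi * X[:, t - 1]) / sx) ** 2
        m = np.exp(log_m - log_m.max()); m /= m.sum()
        A[N - 1, t] = rng.choice(N, p=m)
        log_w = -0.5 * ((y[t] - X[:, t]) / sy) ** 2
    # draw one output trajectory by tracing ancestors backwards
    w = np.exp(log_w - log_w.max()); w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.empty(T)
    for t in range(T - 1, -1, -1):
        traj[t] = X[k, t]
        if t > 0:
            k = A[k, t]
    return traj
```

Iterating this kernel (with the output trajectory becoming the next reference) gives the improved mixing described in the abstract, without a separate backward sweep.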
ON THE USE OF BACKWARD SIMULATION IN THE PARTICLE GIBBS SAMPLER
Cited by 7 (5 self)
The particle Gibbs (PG) sampler was introduced in [1] as a way to incorporate a particle filter (PF) in a Markov chain Monte Carlo (MCMC) sampler. The resulting method was shown to be an efficient tool for joint Bayesian parameter and state inference in nonlinear, non-Gaussian state-space models. However, the mixing of the PG kernel can be very poor when there is severe degeneracy in the PF. Hence, the success of the PG sampler relies heavily on the often unrealistic assumption that we can implement a PF without suffering from any considerable degeneracy. However, as pointed out by Whiteley [2] in the discussion following [1], the mixing can be improved by adding a backward simulation step to the PG sampler. Here, we investigate this further, derive an explicit PG sampler with backward simulation (denoted PGBSi) and show that this indeed is a valid MCMC method. Furthermore, we show in a numerical example that backward simulation can lead to a considerable increase in performance over the standard PG sampler. Index Terms—Particle Markov chain Monte Carlo, particle filter, particle Gibbs, backward simulation, Gibbs sampling.
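The backward-simulation step referred to above can be sketched for a toy AR(1) state-space model: run a forward particle filter that stores every particle and its weight, then draw one trajectory backwards, reweighting the filter weights by the transition density. This is a hedged illustration under assumed toy dynamics, not the paper's exact PGBSi sampler (which also conditions on a reference trajectory).

```python
import numpy as np

def forward_filter(y, n_particles=200, phi=0.9, sx=1.0, sy=1.0, rng=None):
    """Bootstrap particle filter that stores all particles and log-weights."""
    rng = rng or np.random.default_rng()
    N, T = n_particles, len(y)
    X = np.empty((N, T)); log_W = np.empty((N, T))
    x = rng.normal(0.0, sx, N)
    for t in range(T):
        X[:, t] = x
        log_W[:, t] = -0.5 * ((y[t] - x) / sy) ** 2
        w = np.exp(log_W[:, t] - log_W[:, t].max()); w /= w.sum()
        x = phi * x[rng.choice(N, N, p=w)] + rng.normal(0.0, sx, N)  # resample, propagate
    return X, log_W

def backward_simulate(X, log_W, phi=0.9, sx=1.0, rng=None):
    """Draw one smoothed trajectory backwards through the stored particles,
    reweighting filter weights by the transition density f(x_{t+1} | x_t)."""
    rng = rng or np.random.default_rng()
    N, T = X.shape
    w = np.exp(log_W[:, -1] - log_W[:, -1].max()); w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.empty(T)
    traj[-1] = X[k, -1]
    for t in range(T - 2, -1, -1):
        log_b = log_W[:, t] - 0.5 * ((traj[t + 1] - phi * X[:, t]) / sx) ** 2
        b = np.exp(log_b - log_b.max()); b /= b.sum()
        k = rng.choice(N, p=b)
        traj[t] = X[k, t]
    return traj
```

Because the backward pass is free to pick any stored particle at each time, it breaks the ancestral-path degeneracy that makes the plain PG kernel mix poorly.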
Efficient Bayesian Inference for Switching State-Space Models using Particle Markov Chain Monte Carlo Methods
, 2010
Cited by 7 (0 self)
Switching state-space models (SSSM) are a popular class of time series models that have found many applications in statistics, econometrics, and advanced signal processing. Bayesian inference for these models typically relies on Markov chain Monte Carlo (MCMC) techniques. However, even sophisticated MCMC methods dedicated to SSSM can prove quite inefficient, as they update potentially strongly correlated variables one at a time. Particle Markov chain Monte Carlo (PMCMC) methods are a recently developed class of MCMC algorithms which use particle filters to build efficient proposal distributions in high dimensions [1]. The existing PMCMC methods of [1] are applicable to SSSM, but are restricted to employing standard particle filtering techniques. Yet, in the context of SSSM, much more efficient particle techniques have been developed [22, 23, 24]. In this paper, we extend the PMCMC framework to enable the use of these efficient particle methods within MCMC. We demonstrate the resulting generic methodology on a variety of examples, including a multiple change-point model for well-log data and a model for U.S./U.K. exchange rate data. These new PMCMC algorithms are shown experimentally to outperform state-of-the-art MCMC techniques for a fixed computational complexity. Additionally, they can be easily parallelized [39], which allows further substantial gains.
Priors over Recurrent Continuous Time Processes
Cited by 4 (2 self)
We introduce the Gamma-Exponential Process (GEP), a prior over a large family of continuous-time stochastic processes. A hierarchical version of this prior (HGEP; the Hierarchical GEP) yields a useful model for analyzing complex time series. Models based on HGEPs display many attractive properties: conjugacy, exchangeability, a closed-form predictive distribution for the waiting times, and exact Gibbs updates for the time-scale parameters. After establishing these properties, we show how posterior inference can be carried out efficiently using particle MCMC methods [1]. This yields an MCMC algorithm that can resample entire sequences atomically while avoiding the complications of introducing the slice and stick auxiliary variables of the beam sampler [2]. We applied our model to the problem of estimating disease progression in multiple sclerosis [3] and to RNA evolutionary modeling [4]. In both domains, we found that our model outperformed the standard rate matrix estimation approach.
An Introduction to Particle Methods with Financial Applications
in Numerical Methods in Finance, Springer Proceedings in Mathematics
, 2012
Cited by 2 (0 self)
The aim of this article is to give a general introduction to the theory of interacting particle methods and an overview of its applications to computational finance.
MCMC for continuous-time discrete-state systems
Cited by 2 (0 self)
We propose a simple and novel framework for MCMC inference in continuous-time discrete-state systems with pure jump trajectories. We construct an exact MCMC sampler for such systems by alternately sampling a random discretization of time given a trajectory of the system, and then a new trajectory given the discretization. The first step can be performed efficiently using properties of the Poisson process, while the second step can make use of discrete-time MCMC techniques based on the forward-backward algorithm. We show the advantage of our approach compared to particle MCMC and a uniformization-based sampler.
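The two-step alternation described above can be sketched for a continuous-time Markov chain with rate matrix Q via uniformization. This is a prior-only sketch under assumed settings (no observations, uniform initial state), so the forward-backward step reduces to sampling a discrete-time chain with transition matrix B = I + Q/omega; the actual sampler also conditions that step on data. All names are illustrative.

```python
import numpy as np

def alternating_sweep(Q, t_end, states, times, omega, rng):
    """One sweep: (1) resample a random time discretization given the current
    trajectory via Poisson thinning, (2) resample the state sequence on that
    grid from the uniformized chain B = I + Q/omega (prior-only sketch)."""
    # step 1: auxiliary (virtual) event times within each constant segment
    seg_ends = np.append(np.asarray(times)[1:], t_end)
    grid = [0.0]
    for s, t0, t1 in zip(states, times, seg_ends):
        rate = omega + Q[s, s]                  # Q[s, s] <= 0, so 0 <= rate < omega
        n = rng.poisson(rate * (t1 - t0))
        grid += sorted(rng.uniform(t0, t1, n).tolist())
        if t1 < t_end:
            grid.append(t1)                     # keep the actual jump times
    grid = np.array(grid)
    # step 2: sample states along the grid from the discrete-time chain
    B = np.eye(len(Q)) + Q / omega              # rows sum to 1 when omega > max |Q_ii|
    new_states = [rng.choice(len(Q))]           # uniform initial state (assumption)
    for _ in grid[1:]:
        new_states.append(rng.choice(len(Q), p=B[new_states[-1]]))
    # drop virtual events (self-transitions) to recover a pure-jump trajectory
    keep = [0] + [i for i in range(1, len(grid)) if new_states[i] != new_states[i - 1]]
    return [new_states[i] for i in keep], grid[keep]
```

Thinning the Poisson(omega) process by the current state's exit rate is what makes the first step exact, while the second step is where a forward-backward pass over observations would slot in.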
Modeling and Control of Induction Machine, Technip Edition
, 1995
Cited by 2 (0 self)
Sparsity-promoting priors have become increasingly popular over recent years due to an increased number of regression and classification applications involving a large number of predictors. In time series applications where observations are collected over time, it is often unrealistic to assume that the underlying sparsity pattern is fixed. We propose here an original class of flexible Bayesian linear models for dynamic sparsity modelling. The proposed class of models expands upon the existing Bayesian literature on sparse regression using generalized multivariate hyperbolic distributions. The properties of the models are explored through both analytic results and simulation studies. We demonstrate the model on a financial application where it is shown that it accurately represents the patterns seen in the analysis of stock and derivative data, and is able to detect major events by filtering an artificial portfolio of assets. Keywords: generalized hyperbolic, Gaussian mixture models, sparsity, dynamic regression
A semiparametric Bayesian approach to Wiener system identification
Cited by 1 (1 self)
We consider a semiparametric, i.e. a mixed parametric/nonparametric, model of a Wiener system. We use a state-space model for the linear dynamical system and a nonparametric Gaussian process (GP) model for the static nonlinearity. The GP model is a flexible model that can describe different types of nonlinearities while avoiding strong assumptions such as monotonicity. We derive an inferential method based on recent advances in Monte Carlo statistical methods, known as particle Markov chain Monte Carlo (PMCMC). The idea underlying PMCMC is to use a particle filter (PF) to generate a sample state trajectory in a Markov chain Monte Carlo sampler. We use a recently proposed PMCMC sampler, denoted particle Gibbs with backward simulation, which has been shown to be efficient even when very few particles are used in the PF. The resulting method is used in a simulation study to identify two different Wiener systems with non-invertible nonlinearities.