Results 1–10 of 66
Optimal filtering of jump diffusions: extracting latent states from asset prices, 2007
"... This paper provides a methodology for computing optimal filtering distributions in discretely observed continuoustime jumpdiffusion models. Although it has received little attention, the filtering distribution is useful for estimating latent states, forecasting volatility and returns, computing mo ..."
Abstract

Cited by 44 (8 self)
This paper provides a methodology for computing optimal filtering distributions in discretely observed continuous-time jump-diffusion models. Although it has received little attention, the filtering distribution is useful for estimating latent states, forecasting volatility and returns, computing model diagnostics such as likelihood ratios, and parameter estimation. Our approach combines time-discretization schemes with Monte Carlo methods to compute the optimal filtering distribution. The method is very general, applying in multivariate jump-diffusion models with nonlinear characteristics and even non-analytic observation equations, such as those that arise when option prices are available. We provide a detailed analysis of the performance of the filter, and analyze four applications: disentangling jumps from stochastic volatility, forecasting realized volatility, likelihood-based model comparison, and filtering using both option prices and underlying returns.
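As a concrete point of reference for the filtering problem this paper addresses, here is a minimal bootstrap particle filter for a toy Euler-discretised jump diffusion observed in Gaussian noise. The model, its parameter values, and the filter design are illustrative assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (not the paper's) Euler-discretised jump diffusion:
#   x_{t+1} = x_t + mu*dt + sigma*sqrt(dt)*eps_t + J_t,
# J_t ~ N(0, s_j^2) with probability lam*dt; observed y_t = x_t + N(0, s_y^2)
mu, sigma, lam, s_j, s_y, dt = 0.05, 0.2, 1.0, 0.5, 0.1, 0.1

def simulate(T):
    x = np.zeros(T)
    for t in range(1, T):
        jump = rng.normal(0, s_j) if rng.random() < lam*dt else 0.0
        x[t] = x[t-1] + mu*dt + sigma*np.sqrt(dt)*rng.normal() + jump
    return x, x + rng.normal(0, s_y, T)

def particle_filter(y, N=500):
    parts = np.zeros(N)
    means = []
    for obs in y:
        # propagate each particle through the Euler scheme, jumps included
        jumps = (rng.random(N) < lam*dt)*rng.normal(0, s_j, N)
        parts = parts + mu*dt + sigma*np.sqrt(dt)*rng.normal(size=N) + jumps
        # weight by the Gaussian observation density, then resample
        w = np.exp(-0.5*((obs - parts)/s_y)**2) + 1e-300  # guard underflow
        parts = parts[rng.choice(N, N, p=w/w.sum())]
        means.append(parts.mean())
    return np.array(means)

x, y = simulate(100)
est = particle_filter(y)  # filtering mean of the latent state
```

The paper's contribution is the general methodology, including non-analytic observation equations such as option prices; this sketch only shows the basic propagate-weight-resample loop behind it.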
Variational inference for diffusion processes, 2008
"... Diffusion processes are a family of continuoustime continuousstate stochastic processes that are in general only partially observed. The joint estimation of the forcing parameters and the system noise (volatility) in these dynamical systems is a crucial, but nontrivial task, especially when the s ..."
Abstract

Cited by 27 (4 self)
Diffusion processes are a family of continuous-time, continuous-state stochastic processes that are in general only partially observed. The joint estimation of the forcing parameters and the system noise (volatility) in these dynamical systems is a crucial but non-trivial task, especially when the system is nonlinear and multimodal. We propose a variational treatment of diffusion processes, which allows us to compute type II maximum likelihood estimates of the parameters by simple gradient techniques and which is computationally less demanding than most MCMC approaches. We also show how a cheap estimate of the posterior over the parameters can be constructed based on the variational free energy.
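The variational machinery the paper develops is considerably more involved; as a minimal illustration of what estimating drift parameters "by simple gradient techniques" looks like, here is gradient ascent on the (in this toy case tractable) likelihood of a discretely observed Ornstein-Uhlenbeck process. The model, values, and optimiser settings are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, sigma, theta_true, T = 0.1, 0.5, 1.5, 2000  # illustrative values

# simulate a discretely observed Ornstein-Uhlenbeck path (exact transitions)
a = np.exp(-theta_true*dt)
v = sigma**2*(1 - a**2)/(2*theta_true)
x = np.zeros(T)
for t in range(1, T):
    x[t] = a*x[t-1] + np.sqrt(v)*rng.normal()

def avg_loglik(theta):
    # per-observation log-likelihood of the drift parameter theta
    a = np.exp(-theta*dt)
    v = sigma**2*(1 - a**2)/(2*theta)
    resid = x[1:] - a*x[:-1]
    return -0.5*np.mean(resid**2/v + np.log(2*np.pi*v))

# plain gradient ascent with finite-difference gradients
theta, lr, h = 0.5, 1.0, 1e-5
for _ in range(1000):
    g = (avg_loglik(theta + h) - avg_loglik(theta - h))/(2*h)
    theta += lr*g
```

For a general nonlinear diffusion the likelihood above is intractable, which is where the paper's variational free-energy bound takes its place as the objective.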
Importance sampling techniques for estimation of diffusion models, 2012
"... This article develops a class of Monte Carlo (MC) methods for simulating conditioned diffusion sample paths, with special emphasis on importance sampling schemes. We restrict attention to a particular type of conditioned diffusions, the socalled diffusion bridge processes. The diffusion bridge is t ..."
Abstract

Cited by 14 (4 self)
This article develops a class of Monte Carlo (MC) methods for simulating conditioned diffusion sample paths, with special emphasis on importance sampling schemes. We restrict attention to a particular type of conditioned diffusion, the so-called diffusion bridge process. The diffusion bridge is the process obtained by conditioning a diffusion to start and finish at specific values at two consecutive times t0 < t1. Diffusion bridge simulation is a highly non-trivial problem. Even at a more elementary level, unconditional simulation of diffusions, that is, without fixing the value of the process at t1, is difficult: it requires simulating from the transition distribution of the diffusion, which is typically intractable. This intractability stems from the implicit specification of the diffusion as the solution of a stochastic differential equation (SDE). Although unconditional simulation can be carried out by various approximate schemes based on discretisations of the SDE, it is not feasible to devise similar schemes for diffusion bridges in general. This has motivated active research over the last 15 years or so into MC methodology for diffusion bridges. The research in this direction has been fuelled by the fundamental role that diffusion bridge simulation plays in statistical inference for diffusion processes. Any statistical analysis which requires the transition density of the process is halted whenever the latter is not explicitly available, which is typically the case. Hence it is challenging to fit diffusion models employed in applications to the incomplete data typically available. An interesting possibility is to approximate the intractable transition density using an appropriate MC scheme and carry ...
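A standard approach in this literature is importance sampling with a bridge-type proposal, in the spirit of Durham and Gallant's modified Brownian bridge. The sketch below estimates the transition density of an OU process, where the exact answer is available for comparison; the model and tuning constants are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, sigma = 1.0, 1.0    # assumed OU model dX = -theta*X dt + sigma dW
x0, x1, T = 0.0, 1.0, 1.0  # bridge endpoints and horizon
M, N = 50, 20000           # Euler sub-steps, importance samples
dt = T / M

def norm_logpdf(x, m, v):
    return -0.5*(np.log(2*np.pi*v) + (x - m)**2/v)

x = np.full(N, x0)
logw = np.zeros(N)
for k in range(M):
    t_left = T - k*dt
    euler_mean = x - theta*x*dt          # target (Euler) dynamics
    if k < M - 1:
        # modified-Brownian-bridge proposal: pull the path toward x1
        prop_mean = x + (x1 - x)*dt/t_left
        prop_var = sigma**2*dt*(t_left - dt)/t_left
        x_new = prop_mean + np.sqrt(prop_var)*rng.normal(size=N)
        logw += (norm_logpdf(x_new, euler_mean, sigma**2*dt)
                 - norm_logpdf(x_new, prop_mean, prop_var))
    else:
        x_new = np.full(N, x1)           # last step lands exactly on x1
        logw += norm_logpdf(x_new, euler_mean, sigma**2*dt)
    x = x_new

est = np.exp(logw).mean()  # importance-sampling estimate of p(x1 | x0)
a = np.exp(-theta*T)       # exact OU transition density, for comparison
exact = np.exp(norm_logpdf(x1, x0*a, sigma**2*(1 - a**2)/(2*theta)))
```

The same weighted paths can serve as draws from the (discretised) diffusion bridge itself, which is the object of interest in the article.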
Inference for stochastic volatility models using time change transformations, 2007
"... transformations ..."
(Show Context)
Particle Gibbs with Ancestor Sampling
"... Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a new PMCMC algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS ..."
Abstract

Cited by 8 (6 self)
Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a new PMCMC algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS). PGAS provides the data analyst with an off-the-shelf class of Markov kernels that can be used to simulate, for instance, the typically high-dimensional and highly autocorrelated state trajectory in a state-space model. The ancestor sampling procedure enables fast mixing of the PGAS kernel even when using seemingly few particles in the underlying SMC sampler. This is important as it can significantly reduce the computational burden that is typically associated with using SMC. PGAS is conceptually similar to the existing PG with backward simulation (PGBS) procedure. Instead of using separate forward and backward sweeps as in PGBS, however, we achieve the same effect in a single forward sweep. This makes PGAS well suited for addressing inference problems not only in state-space models, but also in models with more complex dependencies, such as non-Markovian, Bayesian nonparametric, and general probabilistic graphical models.
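A minimal PGAS sketch for a linear-Gaussian state-space model (an assumed toy example, not from the paper) shows the two ingredients: conditioning the SMC sweep on a retained reference trajectory, and resampling that trajectory's ancestors during the forward pass:

```python
import numpy as np

rng = np.random.default_rng(3)
a, q, r, T = 0.9, 0.1, 0.1, 50   # assumed linear-Gaussian toy model

x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a*x_true[t-1] + rng.normal(0, np.sqrt(q))
y = x_true + rng.normal(0, np.sqrt(r), T)

def csmc_as(y, x_ref, N=20):
    """One sweep of conditional SMC with ancestor sampling (PGAS kernel)."""
    T = len(y)
    X = np.zeros((T, N))
    A = np.zeros((T, N), dtype=int)
    X[0, :N-1] = rng.normal(0, np.sqrt(q), N-1)
    X[0, N-1] = x_ref[0]                 # reference particle kept fixed
    logw = -0.5*(y[0] - X[0])**2/r
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        A[t, :N-1] = rng.choice(N, N-1, p=w)
        X[t, :N-1] = a*X[t-1, A[t, :N-1]] + rng.normal(0, np.sqrt(q), N-1)
        X[t, N-1] = x_ref[t]
        # ancestor sampling: resample the reference particle's history
        logm = logw - 0.5*(x_ref[t] - a*X[t-1])**2/q
        m = np.exp(logm - logm.max()); m /= m.sum()
        A[t, N-1] = rng.choice(N, p=m)
        logw = -0.5*(y[t] - X[t])**2/r
    w = np.exp(logw - logw.max()); w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.zeros(T)
    for t in range(T-1, -1, -1):         # trace ancestors backwards
        traj[t] = X[t, k]
        k = A[t, k]
    return traj

x_ref = np.zeros(T)                      # any initial reference path works
draws = []
for _ in range(50):
    x_ref = csmc_as(y, x_ref)            # Gibbs sweeps of the PGAS kernel
    draws.append(x_ref)
traj_mean = np.mean(draws[10:], axis=0)
```

Without the ancestor-sampling step this is plain particle Gibbs, which is known to mix poorly for small N; the single extra resampling line is what the paper's single-forward-sweep claim refers to.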
Bayesian Inference for Irreducible Diffusion Processes Using the Pseudo-Marginal Approach, 2010
"... In this article we examine two relatively new MCMC methods which allow for Bayesian inference in diffusion models. First, the Monte Carlo within Metropolis (MCWM) algorithm (O’Neil, Balding, Becker, Serola and Mollison, 2000) uses an importance sampling approximation for the likelihood and yields a ..."
Abstract

Cited by 7 (0 self)
In this article we examine two relatively new MCMC methods which allow for Bayesian inference in diffusion models. First, the Monte Carlo within Metropolis (MCWM) algorithm (O'Neill, Balding, Becker, Eerola and Mollison, 2000) uses an importance sampling approximation for the likelihood and yields a limiting stationary distribution that can be made arbitrarily "close" to the posterior distribution (MCWM is not a standard Metropolis-Hastings algorithm, however). The second method, described in Beaumont (2003) and generalized in Andrieu and Roberts (2009), introduces auxiliary variables and utilizes a standard Metropolis-Hastings algorithm on the enlarged space; this method preserves the original posterior distribution. When applied to diffusion models, this approach can be viewed as a generalization of the popular data augmentation schemes that sample jointly from the missing paths and the parameters of the diffusion volatility. We show that increasing the number of auxiliary variables dramatically increases the acceptance rates in the MCMC algorithm (compared to basic data augmentation schemes), allowing for rapid convergence and mixing. The efficacy of these methods is demonstrated in a simulation study of the Cox-Ingersoll-Ross (CIR) model and an analysis of a real-world dataset.
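A minimal pseudo-marginal sketch, on an assumed toy latent-variable model rather than a diffusion: the likelihood is replaced by an unbiased importance-sampling estimate, and the current estimate is recycled between iterations (the Beaumont / Andrieu-Roberts construction), so the chain still targets the exact posterior:

```python
import numpy as np

rng = np.random.default_rng(4)
mu_true, s, n = 2.0, 0.5, 50
x_lat = rng.normal(mu_true, 1.0, n)   # latent variables
y = x_lat + rng.normal(0, s, n)       # observations

def loglik_hat(mu, M=100):
    # unbiased importance-sampling estimate of p(y | mu), x_i integrated out
    xs = rng.normal(mu, 1.0, (M, n))
    dens = np.exp(-0.5*((y - xs)/s)**2)/(s*np.sqrt(2*np.pi))
    return float(np.sum(np.log(dens.mean(axis=0))))

chain = np.zeros(5000)
mu_cur, ll_cur = 0.0, loglik_hat(0.0)
for i in range(5000):
    mu_prop = mu_cur + 0.2*rng.normal()
    ll_prop = loglik_hat(mu_prop)
    # flat prior on mu; crucially, ll_cur is recycled, never re-estimated,
    # which is what makes the chain target the exact posterior
    if np.log(rng.random()) < ll_prop - ll_cur:
        mu_cur, ll_cur = mu_prop, ll_prop
    chain[i] = mu_cur
```

Re-estimating the likelihood of the current state at every iteration instead would give the MCWM variant, which targets only an approximate posterior.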
Bayesian state-space modelling on high-performance hardware using LibBi, 2013
"... LibBi is a software package for statespace modelling and Bayesian inference on modern computer hardware, including multicore central processing units (CPUs), manycore graphics processing units (GPUs) and distributedmemory clusters of such devices. The software parses a domainspecific language f ..."
Abstract

Cited by 5 (1 self)
LibBi is a software package for state-space modelling and Bayesian inference on modern computer hardware, including multi-core central processing units (CPUs), many-core graphics processing units (GPUs) and distributed-memory clusters of such devices. The software parses a domain-specific language for model specification, then optimises, generates, compiles and runs code for the given model, inference method and hardware platform. In presenting the software, this work serves as an introduction to state-space models and the specialised methods developed for Bayesian inference with them. The focus is on sequential Monte Carlo (SMC) methods such as the particle filter for state estimation, and the particle Markov chain Monte Carlo (PMCMC) and SMC2 methods for parameter estimation. All are well suited to current computer hardware. Two examples are given and developed throughout: one a linear three-element windkessel model of the human arterial system, the other a nonlinear Lorenz '96 model. These are specified in the prescribed modelling language, and LibBi is demonstrated by performing inference with them. Empirical results are presented, including a performance comparison of the software across different hardware configurations.
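The particle-filter-inside-MCMC combination (particle marginal Metropolis-Hastings, one of the PMCMC methods mentioned) can be sketched in a few lines. The toy linear-Gaussian model and all tuning values below are assumptions, and real LibBi models are written in its own modelling language rather than Python:

```python
import numpy as np

rng = np.random.default_rng(6)
a_true, q, r, T = 0.8, 0.1, 0.1, 50  # assumed toy state-space model

x = np.zeros(T)
for t in range(1, T):
    x[t] = a_true*x[t-1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), T)

def pf_loglik(a, N=100):
    # bootstrap particle filter: unbiased estimate of the likelihood of a
    parts = rng.normal(0, 1.0, N)
    ll = 0.0
    for obs in y:
        parts = a*parts + rng.normal(0, np.sqrt(q), N)
        w = np.exp(-0.5*(obs - parts)**2/r)/np.sqrt(2*np.pi*r) + 1e-300
        ll += np.log(w.mean())
        parts = parts[rng.choice(N, N, p=w/w.sum())]
    return ll

chain = np.zeros(1500)
a_cur, ll_cur = 0.0, pf_loglik(0.0)
for i in range(1500):
    a_prop = a_cur + 0.1*rng.normal()
    if abs(a_prop) < 1.0:            # uniform(-1, 1) prior on a
        ll_prop = pf_loglik(a_prop)
        if np.log(rng.random()) < ll_prop - ll_cur:
            a_cur, ll_cur = a_prop, ll_prop
    chain[i] = a_cur
```

Each MCMC iteration runs a full particle filter, which is why the embarrassingly parallel particle operations map so well onto the GPU and cluster hardware the paper targets.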
Data augmentation for diffusions, 2013
"... The problem of formal likelihoodbased (either classical or Bayesian) inference for discretely observed multidimensional diffusions is particularly challenging. In principle this involves dataaugmentation of the observation data to give representations of the entire diffusion trajectory. Most curr ..."
Abstract

Cited by 4 (0 self)
The problem of formal likelihood-based (either classical or Bayesian) inference for discretely observed multidimensional diffusions is particularly challenging. In principle this involves data augmentation of the observations to give representations of the entire diffusion trajectory. Most currently proposed methodology splits broadly into two classes: either through the discretisation of idealised approaches for the continuous-time diffusion setup, or through the use of standard finite-dimensional methodologies in the discretisation of the diffusion model. The connections between these approaches have not been well studied. This paper provides a unified framework bringing together these approaches, demonstrating connections, and in some cases surprising differences. As a result, we provide, for the first time, theoretical justification for the various methods of imputing missing data. The inference problems are particularly challenging for reducible diffusions, and our framework is correspondingly more complex in that case. Therefore we treat the reducible and irreducible cases differently within the paper. Supplementary materials for the ...
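The simplest data-augmentation scheme in this family alternates between imputing the missing fine-scale path and updating the diffusion coefficient. A sketch for driftless Brownian motion, where the bridge law is exactly Gaussian (model and values assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
sigma_true, n_obs, m = 0.7, 50, 4   # m points imputed per unit interval
dt = 1.0/(m + 1)                    # fine-grid spacing

# observed skeleton: driftless Brownian motion at integer times 0..n_obs
obs = np.concatenate([[0.0], np.cumsum(rng.normal(0, sigma_true, n_obs))])

sigma2 = 1.0
draws = []
for sweep in range(600):
    # (1) impute interior points by sequential Brownian bridging, given sigma2
    left, right = obs[:-1], obs[1:]
    path = [left.copy()]
    x = left.copy()
    for k in range(m):
        rem = m + 1 - k             # sub-intervals left before the endpoint
        mean = x + (right - x)/rem
        var = sigma2*dt*(rem - 1)/rem
        x = mean + np.sqrt(var)*rng.normal(size=n_obs)
        path.append(x.copy())
    path.append(right.copy())
    incr = np.diff(np.array(path), axis=0)
    # (2) conjugate update with a 1/sigma2 prior:
    #     sigma2 | path ~ InvGamma(K/2, sum(dx^2)/(2*dt))
    K = incr.size
    sigma2 = (np.sum(incr**2)/(2*dt))/rng.gamma(K/2)
    draws.append(np.sqrt(sigma2))
```

A known pathology, central to this literature, is that the quadratic variation of the imputed path pins down the volatility, so mixing of this naive scheme degrades as the imputation grid is refined; the modest m here keeps the toy chain usable, and circumventing the degeneracy is precisely what the more sophisticated schemes the paper unifies are for.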
CaliBayes and BASIS: integrated tools for the calibration, simulation and storage of biological simulation models
"... Dynamic simulation modelling of complex biological processes forms the backbone of systems biology. Discrete stochastic models are particularly appropriate for describing subcellular molecular interactions, especially when critical molecular species are thought to be present at low copynumbers. Fo ..."
Abstract

Cited by 4 (2 self)
Dynamic simulation modelling of complex biological processes forms the backbone of systems biology. Discrete stochastic models are particularly appropriate for describing subcellular molecular interactions, especially when critical molecular species are thought to be present at low copy numbers. For example, these stochastic effects play an important role in models of human ageing, where ageing results from the long-term accumulation of random damage at various biological scales. Unfortunately, realistic stochastic simulation of discrete biological processes is highly computationally intensive, requiring specialist hardware, and can benefit greatly from parallel and distributed approaches to computation and analysis. For these reasons, we have developed the BASIS system for the simulation and storage of stochastic SBML models together with associated simulation results. This system is exposed as a set of web services to allow users to incorporate its simulation tools into their workflows. Parameter inference for stochastic models is also difficult and computationally expensive. The CaliBayes system provides a set of web services (together with an R package for consuming these and formatting data) which addresses this problem for SBML models. It uses a sequential Bayesian MCMC method which is powerful and flexible, providing very rich information. However, this approach is exceptionally computationally intensive and requires the use of a carefully designed architecture. Again, these tools are exposed as web services to allow users to take advantage of this system. In this paper we describe these two systems and demonstrate their integrated use with an example workflow to estimate the parameters of a simple model of S. cerevisiae growth on agar plates.
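The exact discrete stochastic simulation underlying such models is typically Gillespie's stochastic simulation algorithm. A self-contained sketch for an assumed immigration-death toy model, not an SBML model from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
k_birth, k_death = 10.0, 0.5   # assumed immigration-death model

def gillespie(t_end, x0=0):
    # exact stochastic simulation (Gillespie) of the birth-death process
    t, x = 0.0, x0
    while True:
        total = k_birth + k_death*x        # total reaction hazard
        t += rng.exponential(1.0/total)    # time to next reaction
        if t > t_end:
            return x
        if rng.random() < k_birth/total:
            x += 1                         # birth: X -> X + 1
        else:
            x -= 1                         # death: X -> X - 1

# ensemble of terminal states; stationary law is Poisson(k_birth/k_death)
finals = np.array([gillespie(50.0) for _ in range(200)])
```

Each realisation simulates every individual reaction event, which is why ensembles of such runs are costly and why the paper distributes them behind web services.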
A Summary of Findings of the West-Central Florida Coastal Studies Project. USGS Open-File Report, 2001
"... It is known that most cases of idiopathic torsion dystonia (ITD) are inherited in an autosomal dominant fashion. Despite clarification of the underlying genetic defect, no consistent structural lesion has been identified in ITD, and it is probable that a biochemical disturbance is the basis of the d ..."
Abstract

Cited by 3 (1 self)
It is known that most cases of idiopathic torsion dystonia (ITD) are inherited in an autosomal dominant fashion. Despite clarification of the underlying genetic defect, no consistent structural lesion has been identified in ITD, and it is probable that a biochemical disturbance is the basis of the disorder. To determine whether there is impaired function of the nigrostriatal dopaminergic terminals in ITD we studied 11 subjects with generalized ITD and a positive family history using [18F]dopa and PET scanning. Of these 11 patients, eight had putamen [18F]dopa uptake within the lower half of the normal range, while three had uptake reduced by >2 SDs below the normal mean. The lowest putamen [18F]dopa influx constants were found in the most disabled patients. As these reductions were mild it is unlikely that abnormalities of the nigrostriatal dopaminergic pathway are the primary determinant of either the nature or the severity of dystonic symptoms. In addition, we studied three presumed carriers of the ITD gene. These subjects all had normal striatal [18F]dopa influx constants, suggesting that [18F]dopa PET is unsuitable as a screening tool for ITD.