Results 1–10 of 56
On Leverage in a Stochastic Volatility Model
Journal of Econometrics, 2005
Abstract

Cited by 39 (7 self)
This note is concerned with the specification for modelling the financial leverage effect in the context of stochastic volatility (SV) models. Two alternative specifications coexist in the literature. One is the Euler approximation to the well-known continuous-time SV model with leverage effect, and the other is the discrete-time SV model of Jacquier, Polson and Rossi (2004, Journal of Econometrics, forthcoming). Using a Gaussian nonlinear state space form with uncorrelated measurement and transition errors, I show that it is easy to interpret the leverage effect in the conventional model, whereas it is not clear how to obtain the leverage effect in the model of Jacquier et al. Empirical comparisons of these two models via Bayesian Markov chain Monte Carlo (MCMC) methods reveal that the specification of Jacquier et al. is inferior. Simulation experiments are conducted to study the sampling properties of the Bayesian MCMC estimator for the conventional model.
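The conventional specification the abstract refers to can be made concrete. The sketch below (not the authors' code; all parameter values are illustrative) simulates a discrete-time SV model with leverage, in which the return innovation and the log-volatility innovation are correlated with coefficient rho, so a negative rho links a negative return today to higher volatility tomorrow.

```python
import numpy as np

def simulate_sv_leverage(n, mu=-9.0, phi=0.97, sigma=0.15, rho=-0.6, seed=0):
    """Simulate the discrete-time SV model with leverage:
        y_t     = exp(h_t / 2) * eps_t
        h_{t+1} = mu + phi * (h_t - mu) + sigma * eta_t
    where (eps_t, eta_t) are standard normal with correlation rho.
    A negative rho produces the leverage effect: a negative return
    today raises tomorrow's log-volatility."""
    rng = np.random.default_rng(seed)
    # Draw correlated innovations via a Cholesky-style construction.
    eps = rng.standard_normal(n)
    eta = rho * eps + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
    h = np.empty(n)
    h[0] = mu
    for t in range(n - 1):
        h[t + 1] = mu + phi * (h[t] - mu) + sigma * eta[t]
    y = np.exp(h / 2.0) * eps
    return y, h
```

With rho = -0.6 the sample correlation between today's return y_t and tomorrow's log-volatility h_{t+1} comes out negative, which is the interpretable leverage effect the note discusses.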
Estimating the Integrated Likelihood via Posterior Simulation Using the Harmonic Mean Identity
Bayesian Statistics, 2007
Abstract

Cited by 26 (2 self)
The integrated likelihood (also called the marginal likelihood or the normalizing constant) is a central quantity in Bayesian model selection and model averaging. It is defined as the integral over the parameter space of the likelihood times the prior density. The Bayes factor for model comparison and Bayesian testing is a ratio of integrated likelihoods, and the model weights in Bayesian model averaging are proportional to the integrated likelihoods. We consider the estimation of the integrated likelihood from posterior simulation output, aiming at a generic method that uses only the likelihoods from the posterior simulation iterations. The key is the harmonic mean identity, which says that the reciprocal of the integrated likelihood is equal to the posterior harmonic mean of the likelihood. The simplest estimator based on the identity is thus the harmonic mean of the likelihoods. While this is an unbiased and simulation-consistent estimator, its reciprocal can have infinite variance, so it is unstable in general. We describe two methods for stabilizing the harmonic mean estimator. In the first, the parameter space is reduced in such a way that the modified estimator involves a harmonic mean of heavier-tailed densities, resulting in a finite-variance estimator.
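As a minimal illustration of the identity, consider a conjugate normal toy model whose integrated likelihood is available in closed form, so the raw harmonic mean estimator can be checked directly. The model and all numerical values below are illustrative assumptions, not from the paper.

```python
import numpy as np

def normal_pdf(x, mean, var):
    """Density of N(mean, var) at x."""
    return np.exp(-(x - mean) ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def harmonic_mean_evidence(y, prior_var=0.5, n_draws=200_000, seed=1):
    """Estimate the integrated likelihood m(y) for the toy model
        y | theta ~ N(theta, 1),   theta ~ N(0, prior_var),
    via the harmonic mean identity: 1 / m(y) = E_post[ 1 / L(theta) ].
    The posterior is conjugate (normal), so we sample it directly;
    in practice the draws would come from MCMC output."""
    rng = np.random.default_rng(seed)
    post_prec = 1.0 + 1.0 / prior_var            # posterior precision
    post_mean = y / post_prec
    theta = rng.normal(post_mean, np.sqrt(1.0 / post_prec), size=n_draws)
    lik = normal_pdf(y, theta, 1.0)              # likelihood at each draw
    return 1.0 / np.mean(1.0 / lik)              # harmonic mean of likelihoods

y_obs = 0.8
exact = normal_pdf(y_obs, 0.0, 1.0 + 0.5)        # marginal is N(0, 1 + prior_var)
est = harmonic_mean_evidence(y_obs)
```

This toy model is deliberately chosen (prior variance below the likelihood variance) so that 1/L has finite posterior variance and the raw estimator is stable; the paper's point is that in general this fails, which motivates its stabilized variants.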
Multilevel Models with Multivariate Mixed Response Types
Statistical Modelling, 2009
Abstract

Cited by 5 (3 self)
Abstract: We build upon the existing literature to formulate a class of models for multivariate mixtures of Gaussian, ordered or unordered categorical responses and continuous distributions that are not Gaussian, each of which can be defined at any level of a multilevel data hierarchy. We describe a Markov chain Monte Carlo algorithm for fitting such models. We show how this unifies a number of disparate problems, including partially observed data and missing data in generalized linear modelling. The two-level model is considered in detail with worked examples of applications to a prediction problem and to multiple imputation for missing data. We conclude with a discussion outlining possible extensions and connections in the literature. Software for estimating the models is freely available. Key words: Box–Cox transformation; data augmentation; data coarsening; latent Gaussian model; maximum indicant model; MCMC; missing data; mixed response models; multilevel; multiple imputation; multivariate; normalising transformations; partially known values; prediction; prior-informed imputation; probit model
Spatial Extremes of Wildfire Sizes: Bayesian Hierarchical Models for Extremes
Environmental and Ecological Statistics 17, 2010
Abstract

Cited by 2 (0 self)
Due to the combination of climatological and ecological factors, large wildfires are a constant threat and, because of their economic impact, a major policy issue. In order to organize efficient firefighting capacity and resource management, correct quantification of large wildfires is needed. In this paper, we quantify the regional risk of large wildfire sizes by fitting a generalized Pareto distribution to excesses over a suitably chosen high threshold. Spatio-temporal variations are introduced into the model through model parameters with suitably chosen link functions. Inference for these models is carried out using Bayesian hierarchical models and Markov chain Monte Carlo methods.
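The peaks-over-threshold ingredient of this approach can be sketched in a few lines. The toy example below fits a generalized Pareto distribution (GPD) to simulated excesses by the method of moments rather than the paper's Bayesian hierarchical MCMC; the data, parameter values, and estimator choice are illustrative assumptions only.

```python
import numpy as np

def gpd_moment_fit(excesses):
    """Method-of-moments fit of a generalized Pareto distribution to
    threshold excesses. For shape xi < 1/2 the GPD satisfies
        mean = sigma / (1 - xi),
        var  = sigma^2 / ((1 - xi)^2 * (1 - 2*xi)),
    which inverts to the closed-form estimators below."""
    m, v = excesses.mean(), excesses.var()
    xi = 0.5 * (1.0 - m * m / v)
    sigma = m * (1.0 - xi)
    return xi, sigma

# Illustrative data: simulate excesses whose distribution is exactly GPD,
# using inverse-CDF sampling: X = sigma * (U^{-xi} - 1) / xi, U ~ Uniform(0,1).
rng = np.random.default_rng(7)
xi_true, sigma_true = 0.2, 1.0
u = rng.uniform(size=50_000)
excesses = sigma_true * (u ** (-xi_true) - 1.0) / xi_true

xi_hat, sigma_hat = gpd_moment_fit(excesses)
```

In the paper's setting, xi and sigma would instead vary over space and time through link functions and be given priors, with the posterior explored by MCMC.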
Modelling the Effects of Air Pollution on Health Using Bayesian Dynamic Generalized Linear Models
Environmetrics, 2008
Abstract

Cited by 2 (2 self)
The relationship between short-term exposure to air pollution and mortality or morbidity has been the subject of much recent research, in which the standard method of analysis uses Poisson linear or additive models. In this paper we use a Bayesian dynamic generalised linear model (DGLM) to estimate this relationship, which allows the standard linear or additive model to be extended in two ways: (i) the long-term trend and temporal correlation present in the health data can be modelled by an autoregressive process rather than a smooth function of calendar time; (ii) the effects of air pollution are allowed to evolve over time. The efficacy of these two extensions is investigated by applying a series of dynamic and non-dynamic models to air pollution and mortality data from Greater London. A Bayesian approach is taken throughout, and a Markov chain Monte Carlo simulation algorithm is presented for inference. An alternative likelihood-based analysis is also presented, in order to allow a direct comparison with the only previous analysis of air pollution and health data using a DGLM. Key words: dynamic generalised linear model; Bayesian analysis; Markov chain Monte Carlo simulation; air pollution
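Extension (ii), a pollution effect that evolves over time, can be sketched as a Poisson observation equation driven by a random-walk regression coefficient. This simulation is a hypothetical illustration of that model structure, not the authors' model or data; alpha, tau, and the synthetic pollution series are all assumed values.

```python
import numpy as np

def simulate_poisson_dglm(n, beta0=0.05, tau=0.01, seed=3):
    """Simulate a Poisson dynamic GLM in which the pollution coefficient
    follows a random walk, so its effect can drift over time:
        beta_t = beta_{t-1} + w_t,   w_t ~ N(0, tau^2)
        y_t ~ Poisson( exp(alpha + beta_t * x_t) )
    Here y_t plays the role of a daily mortality count and x_t a
    standardised pollution series (both synthetic)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=n)                       # synthetic pollution
    beta = beta0 + np.cumsum(rng.normal(0.0, tau, size=n)) # random-walk effect
    alpha = 3.0                                            # baseline log count
    y = rng.poisson(np.exp(alpha + beta * x))
    return y, beta, x
```

Fitting such a model, as the paper does, means inferring the latent path beta_1..beta_n (and the variances) from the observed counts, e.g. via MCMC; the simulation above only shows the data-generating side.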