Results 1–10 of 26
Estimating the integrated likelihood via posterior simulation using the harmonic mean identity
 Bayesian Statistics
, 2007
Cited by 42 (2 self)
Abstract:
The integrated likelihood (also called the marginal likelihood or the normalizing constant) is a central quantity in Bayesian model selection and model averaging. It is defined as the integral over the parameter space of the likelihood times the prior density. The Bayes factor for model comparison and Bayesian testing is a ratio of integrated likelihoods, and the model weights in Bayesian model averaging are proportional to the integrated likelihoods. We consider the estimation of the integrated likelihood from posterior simulation output, aiming at a generic method that uses only the likelihoods from the posterior simulation iterations. The key is the harmonic mean identity, which says that the reciprocal of the integrated likelihood is equal to the posterior harmonic mean of the likelihood. The simplest estimator based on the identity is thus the harmonic mean of the likelihoods. While this is an unbiased and simulation-consistent estimator, its reciprocal can have infinite variance and so it is unstable in general. We describe two methods for stabilizing the harmonic mean estimator. In the first one, the parameter space is reduced in such a way that the modified estimator involves a harmonic mean of heavier-tailed densities, thus resulting in a finite variance estimator. The resulting ...
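As a concrete illustration of the identity, here is a minimal numpy sketch on an assumed toy conjugate model (normal data, normal prior, sample size and seed chosen arbitrarily) where the integrated likelihood is available in closed form for comparison. It deliberately uses the basic, unstabilized harmonic mean estimator the abstract warns about:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model (assumed for illustration): x_i ~ N(theta, 1) with
# prior theta ~ N(0, 1), so the integrated likelihood m(x) is known exactly.
n = 2
x = rng.normal(0.5, 1.0, size=n)

# Exact posterior: theta | x ~ N(sum(x)/(n+1), 1/(n+1))
s2 = 1.0 / (n + 1)
draws = rng.normal(s2 * x.sum(), np.sqrt(s2), size=50_000)

# Log-likelihood of the full data set at each posterior draw
loglik = np.array([-0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((x - t) ** 2)
                   for t in draws])

def logsumexp(a):
    """Numerically stable log of a sum of exponentials."""
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

# Harmonic mean identity: 1/m(x) = E_posterior[1/L(theta)], so the estimator
# is the harmonic mean of the likelihoods, computed here on the log scale.
log_m_hat = -(logsumexp(-loglik) - np.log(len(loglik)))

# Closed form: x ~ N(0, I + 11'), with |Sigma| = n + 1 and
# x' Sigma^{-1} x = sum(x^2) - sum(x)^2 / (n + 1)
log_m_true = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1.0)
              - 0.5 * (np.sum(x ** 2) - x.sum() ** 2 / (n + 1.0)))
```

Because the reciprocal likelihoods are heavy-tailed under the posterior, the estimate wanders even with many draws, which is exactly the instability the stabilization methods in the paper address.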
Bayesian Simultaneous Equations Analysis using Reduced Structures
, 1997
Cited by 32 (3 self)
Abstract:
Diffuse priors lead to pathological posterior behavior when used in Bayesian analyses of Simultaneous Equation Models (SEMs). This results from the local non-identification of certain parameters in SEMs. When this a priori known feature is not captured appropriately, the posterior favors certain specific parameter values, not as a consequence of strong data information but of local non-identification. We show that a proper consistent Bayesian analysis of a SEM explicitly has to consider the reduced form of the SEM as a standard linear model on which nonlinear (reduced rank) restrictions are imposed, which result from a singular value decomposition. The priors/posteriors of the parameters of the SEM are therefore proportional to the priors/posteriors of the parameters of the linear model under the condition that the restrictions hold. This leads to a framework for constructing priors and posteriors for the parameters of SEMs. The framework is used to construct priors and pos...
Priors, Posteriors and Bayes Factors for a Bayesian Analysis of Cointegration
 Journal of Econometrics
, 1999
Cited by 28 (3 self)
Abstract:
Cointegration occurs when the long-run multiplier of a vector autoregressive model exhibits rank reduction. Priors and posteriors of the parameters of the cointegration model are therefore proportional to priors and posteriors of the long-run multiplier given that it has reduced rank. Rank reduction of the long-run multiplier is modelled using a decomposition resulting from its singular value decomposition. It specifies the long-run multiplier matrix as the sum of a matrix that equals the product of the adjustment parameters and the cointegrating vectors, i.e., the cointegration specification, and a matrix that models the deviation from cointegration. Priors and posteriors for the parameters of the cointegration model are obtained by restricting the latter matrix to zero in the prior and posterior of the unrestricted long-run multiplier. The special decomposition of the long-run multiplier results in unique posterior densities. This theory leads to a complete Bayesian framework for cointegration analysis. It includes prior specification, simulation schemes for obtaining posterior distributions and determination of the cointegration rank via Bayes factors. We illustrate the analysis with several simulated series, the UK data ...
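A minimal numpy sketch of the decomposition idea described above, using an assumed toy 3×3 long-run multiplier: the SVD splits it into a reduced-rank cointegration part (adjustment parameters times cointegrating vectors) plus a deviation matrix, which is the term restricted to zero under cointegration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed toy "long-run multiplier": near rank-1 plus small noise
Pi = np.outer([1.0, 0.5, -0.3], [0.8, -0.2, 0.4]) + 0.01 * rng.normal(size=(3, 3))

# Singular value decomposition of the long-run multiplier
U, s, Vt = np.linalg.svd(Pi)

r = 1  # assumed cointegration rank
# Cointegration part: adjustment parameters @ cointegrating vectors
cointegration_part = (U[:, :r] * s[:r]) @ Vt[:r, :]
# Deviation from cointegration: set to zero in the restricted prior/posterior
deviation = Pi - cointegration_part
```

The two pieces reconstruct the unrestricted multiplier exactly, and the deviation term is small when the true matrix is close to reduced rank.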
Bayesian Variable Selection for Proportional Hazards Models
, 1996
Cited by 17 (1 self)
Abstract:
The authors consider the problem of Bayesian variable selection for proportional hazards regression models with right censored data. They propose a semiparametric approach in which a nonparametric prior is specified for the baseline hazard rate and a fully parametric prior is specified for the regression coefficients. For the baseline hazard, they use a discrete gamma process prior, and for the regression coefficients and the model space, they propose a semi-automatic parametric informative prior specification that focuses on the observables rather than the parameters. To implement the methodology, they propose a Markov chain Monte Carlo method to compute the posterior model probabilities. Examples using simulated and real data are given to demonstrate the methodology.
Bayes Estimates of Markov Trends in Possibly Cointegrated Series: An Application to US Consumption and Income
 Journal of Business and Economic Statistics
, 1999
Cited by 14 (3 self)
Abstract:
Stylized facts show that average growth rates of US per capita consumption and income differ in recession and expansion periods. Since a linear combination of such series does not have to be a constant mean process, standard cointegration analysis between the variables to examine the permanent income hypothesis may not be valid. To model the changing growth rates in both series, we introduce a multivariate Markov trend model, which accounts for different growth rates in consumption and income during expansions and recessions and across variables within both regimes. The deviations from the multivariate Markov trend are modeled by a vector autoregressive model. Bayes estimates of this model are obtained using Markov chain Monte Carlo methods. The empirical results suggest the existence of a cointegration relation between US per capita disposable income and consumption, after correction for a multivariate Markov trend. This result is also obtained when per capita investment is added to the vector autoregression.
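The core idea of a Markov trend, regime-dependent drift driven by a hidden two-state Markov chain, can be sketched in a few lines. The transition probabilities, growth rates, and noise scale below are illustrative assumptions, not estimates from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two-state Markov trend: state 0 = expansion, state 1 = recession.
P = np.array([[0.95, 0.05],    # assumed transition probabilities
              [0.20, 0.80]])
mu = np.array([0.8, -0.5])     # assumed growth rate per regime

T = 400
s = np.zeros(T, dtype=int)     # regime path
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])

# Level series: cumulated regime-dependent drift plus noise
# (the paper models the deviations with a VAR; plain noise here)
y = np.cumsum(mu[s] + 0.3 * rng.normal(size=T))
```

With these transition probabilities the chain spends roughly 80% of the time in the expansion regime, so the simulated level series shows the long upswings and shorter contractions that motivate the model.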
Continuous contour Monte Carlo for marginal density estimation with an application to a spatial statistical model
 Journal of Computational and Graphical Statistics
, 2007
Cited by 10 (5 self)
Abstract:
The problem of marginal density estimation for a multivariate density function f(x) can be generally stated as a problem of density function estimation for a random vector λ(x) of dimension lower than that of x. In this article, we propose a technique, the so-called continuous Contour Monte Carlo (CCMC) algorithm, for solving this problem. CCMC can be viewed as a continuous version of the contour Monte Carlo (CMC) algorithm recently proposed in the literature. CCMC abandons the use of sample space partitioning and incorporates the techniques of kernel density estimation into its simulations. CCMC is more general than other marginal density estimation algorithms. First, it works for any density function, even for those having a rugged or unbalanced energy landscape. Second, it works for any transformation λ(x) regardless of the availability of the analytical form of the inverse transformation. In this article, CCMC is applied to estimate the unknown normalizing constant function for a spatial autologistic model, and the estimate is then used in a Bayesian analysis for the spatial autologistic model in place of the true normalizing constant function. Numerical results on the U.S. cancer mortality data indicate that the Bayesian method can produce much more accurate estimates than the MPLE and MCMLE methods for the parameters of the spatial autologistic model.
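The kernel-density building block that CCMC incorporates can be sketched directly: draw from f(x), push the draws through λ(x), and smooth them with a Gaussian kernel. The bivariate normal target and λ(x) = x₁ + x₂ below are illustrative assumptions; this omits CCMC's contour-based reweighting entirely.

```python
import numpy as np

rng = np.random.default_rng(1)

# Samples from a toy f(x): bivariate normal with correlation 0.5
samples = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=20_000)

# Transformation lambda(x) = x1 + x2; its marginal density is the target
lam = samples.sum(axis=1)

# Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth
h = 1.06 * lam.std() * len(lam) ** (-1 / 5)

def kde(t):
    """Estimated marginal density of lambda(x) at point t."""
    return np.mean(np.exp(-0.5 * ((t - lam) / h) ** 2)) / (h * np.sqrt(2 * np.pi))
```

For this toy case, x₁ + x₂ ~ N(0, 3), so the estimate can be checked against the exact density, a luxury the spatial autologistic application does not have.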
Bayesian variable selection for time series count data
 Statistica Sinica
, 2000
Cited by 6 (0 self)
Abstract: We consider a parametric model for time series of counts by constructing a likelihood-based generalization of a model considered by Zeger (1988). We consider a Bayesian approach and propose a class of informative prior distributions for the model parameters that are useful for variable subset selection. The prior specification is motivated from the notion of the existence of data from similar previous studies, called historical data, which is then quantified in a prior distribution for the current study. We derive theoretical and computational properties of the proposed priors and develop novel methods for computing posterior model probabilities. To compute the posterior model probabilities, we show that only posterior samples from the full model are needed to estimate the posterior probabilities for all of the possible subset models. We demonstrate our methodology with a simulated and a real data set.
Variable selection for multivariate logistic regression models
 Journal of Statistical Planning and Inference
, 2003
Cited by 6 (0 self)
Abstract:
In this paper, we use multivariate logistic regression models to incorporate correlation among binary response data. Our objective is to develop a variable subset selection procedure to identify important covariates in predicting correlated binary responses using a Bayesian approach. In order to incorporate available prior information, we propose a class of informative prior distributions on the model parameters and on the model space. The propriety of the proposed informative prior is investigated in detail. Novel computational algorithms are also developed for sampling from the posterior distribution as well as for computing posterior model probabilities. Finally, a simulated data example and a real data example from a prostate cancer study are used to illustrate the proposed methodology.
Monte Carlo State-Space Likelihoods by Weighted Posterior Kernel Density Estimation
 Journal of the American Statistical Association
, 2004
Cited by 5 (1 self)
Abstract:
Maximum likelihood estimation and likelihood ratio tests for nonlinear, non-Gaussian state-space models require numerical integration for likelihood calculations. Several methods, including Monte Carlo (MC) expectation maximization, MC likelihood ratios, direct MC integration, and particle filter likelihoods, are inefficient for the motivating problem of stage-structured population dynamics models in experimental settings. An MC kernel likelihood (MCKL) method is presented that estimates classical likelihoods up to a constant by weighted kernel density estimates of Bayesian posteriors. MCKL is derived by using Bayesian posteriors as importance sampling densities for unnormalized kernel smoothing integrals. MC error and mode bias due to kernel smoothing are discussed, and two methods for reducing mode bias are proposed: “zooming in” on the maximum likelihood parameters using a focused prior based on an initial estimate, and using a posterior cumulant-based approximation of mode bias. A simulated example shows that MCKL can be much more efficient than previous approaches for the population dynamics problem. The zooming-in and cumulant-based corrections are illustrated with a multivariate variance estimation problem for which accurate results are obtained even in 20 parameter dimensions.
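The identity MCKL builds on, that the likelihood is proportional to the posterior density divided by the prior density (up to the marginal likelihood constant), can be sketched on an assumed one-dimensional toy problem. This shows only the basic posterior-KDE-over-prior step, not the paper's weighted kernel smoothing integrals or bias corrections:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed toy model: x_i ~ N(theta, 1) with diffuse prior theta ~ N(0, 10^2),
# so the MLE is simply the sample mean, giving a target to check against.
x = rng.normal(1.0, 1.0, size=30)
xbar = x.mean()

# Exact conjugate posterior draws for theta
prec = 30 + 1 / 100
mu = x.sum() / prec
draws = rng.normal(mu, np.sqrt(1 / prec), size=40_000)

def prior(t):
    return np.exp(-0.5 * (t / 10) ** 2) / (10 * np.sqrt(2 * np.pi))

# Gaussian KDE of the posterior with Silverman's rule-of-thumb bandwidth
h = 1.06 * draws.std() * len(draws) ** (-1 / 5)

def post_kde(t):
    return np.mean(np.exp(-0.5 * ((t - draws) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

# Likelihood up to a constant: estimated posterior density / prior density
grid = np.linspace(xbar - 1, xbar + 1, 401)
lhat = np.array([post_kde(t) / prior(t) for t in grid])
theta_hat = grid[lhat.argmax()]  # maximizer should sit near the MLE xbar
```

In one dimension with a near-flat prior this is almost trivially accurate; the paper's contribution lies in making the approach workable for the high-dimensional, non-Gaussian state-space posteriors where the mode bias corrections matter.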