Results 1–10 of 49
Has the U.S. Economy Become More Stable? A Bayesian Approach Based on a Markov-Switching Model of the Business Cycle
, 1999
Abstract

Cited by 255 (13 self)
We hope to be able to provide answers to the following questions: 1) Has there been a structural break in postwar U.S. real GDP growth toward more stabilization? 2) If so, when would it have been? 3) What is the nature of the structural break? For this purpose, we employ a Bayesian approach to dealing with a structural break at an unknown changepoint in a Markov-switching model of the business cycle. Empirical results suggest that there has been a structural break in U.S. real GDP growth toward more stabilization, with the posterior mode of the break date around 1984:1. Furthermore, we find that a narrowing gap between growth rates during recessions and booms is at least as important as a decline in the volatility of shocks. Key Words: Bayes Factor, Gibbs Sampling, Marginal Likelihood, Markov-Switching, Stabilization, Structural Break. JEL Classifications: C11, C12, C22, E32. 1. Introduction In the literature, the issue of postwar stabilization of the U.S. economy relative to the prewar period has...
Computing Bayes factors using thermodynamic integration
 Syst Biol
Abstract

Cited by 33 (5 self)
Abstract.—In the Bayesian paradigm, a common method for comparing two models is to compute the Bayes factor, defined as the ratio of their respective marginal likelihoods. In recent phylogenetic works, the numerical evaluation of marginal likelihoods has often been performed using the harmonic mean estimation procedure. In the present article, we propose to employ another method, based on an analogy with statistical physics, called thermodynamic integration. We describe the method, propose an implementation, and show on two analytical examples that this numerical method yields reliable estimates. In contrast, the harmonic mean estimator leads to a strong overestimation of the marginal likelihood, which is all the more pronounced as the model is higher dimensional. As a result, the harmonic mean estimator systematically favors more parameter-rich models, an artefact that might explain some recent puzzling observations, based on harmonic mean estimates, suggesting that Bayes factors tend to overscore complex models. Finally, we apply our method to the comparison of several alternative models of amino-acid replacement. We confirm our previous observations, indicating that modeling pattern heterogeneity across sites tends to yield better models than standard empirical matrices. [Bayes factor; harmonic mean; mixture model; path sampling; phylogeny; thermodynamic integration.] Bayesian methods have become popular in molecular phylogenetics over the recent years. The simple and intuitive interpretation of the concept of probabilities...
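The thermodynamic-integration identity this abstract relies on, log m(y) = ∫₀¹ E_β[log L(θ)] dβ with expectations taken under the power posteriors p_β ∝ L(θ)^β π(θ), can be sketched on a conjugate toy model where the true marginal likelihood and every power posterior are known in closed form. The model, the β grid, and the sample sizes below are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative conjugate toy model: y ~ N(theta, 1) with prior
# theta ~ N(0, 1), so the true marginal likelihood is the N(0, 2)
# density at y and the estimate can be checked exactly.
y = 1.3
log_true = -0.5 * np.log(4 * np.pi) - y**2 / 4

def log_lik(theta):
    return -0.5 * np.log(2 * np.pi) - 0.5 * (y - theta) ** 2

# The power posterior p_beta ∝ L^beta * prior is N(beta*y/(beta+1),
# 1/(beta+1)) here, so it can be sampled directly; in a real problem
# each beta would require its own MCMC run.
betas = np.linspace(0.0, 1.0, 21)
means = np.array([
    log_lik(rng.normal(b * y / (b + 1), np.sqrt(1 / (b + 1)), 50_000)).mean()
    for b in betas
])

# Thermodynamic integration: log m(y) = ∫_0^1 E_beta[log L] d(beta),
# approximated with the trapezoidal rule over the beta grid.
log_ti = np.sum(np.diff(betas) * (means[1:] + means[:-1]) / 2)
print(log_true, log_ti)
```

The curve β ↦ E_β[log L] runs from the prior expectation of the log-likelihood (β = 0) to its posterior expectation (β = 1); the discretization of that path is where the method's bias, in contrast to the harmonic mean's, is both visible and controllable.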
Estimating the integrated likelihood via posterior simulation using the harmonic mean identity
 Bayesian Statistics
, 2007
Abstract

Cited by 24 (2 self)
The integrated likelihood (also called the marginal likelihood or the normalizing constant) is a central quantity in Bayesian model selection and model averaging. It is defined as the integral over the parameter space of the likelihood times the prior density. The Bayes factor for model comparison and Bayesian testing is a ratio of integrated likelihoods, and the model weights in Bayesian model averaging are proportional to the integrated likelihoods. We consider the estimation of the integrated likelihood from posterior simulation output, aiming at a generic method that uses only the likelihoods from the posterior simulation iterations. The key is the harmonic mean identity, which says that the reciprocal of the integrated likelihood is equal to the posterior harmonic mean of the likelihood. The simplest estimator based on the identity is thus the harmonic mean of the likelihoods. While this is an unbiased and simulation-consistent estimator, its reciprocal can have infinite variance and so it is unstable in general. We describe two methods for stabilizing the harmonic mean estimator. In the first one, the parameter space is reduced in such a way that the modified estimator involves a harmonic mean of heavier-tailed densities, thus resulting in a finite variance estimator. The resulting...
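The harmonic mean identity at the heart of this abstract, 1/m(y) = E_post[1/L(θ)], can be illustrated on a conjugate toy model where the integrated likelihood is known exactly; the model and sample size are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative conjugate toy model: y ~ N(theta, 1) with prior
# theta ~ N(0, 1).  The true integrated likelihood is the N(0, 2)
# density at y.
y = 1.3
true_marginal = np.exp(-y**2 / 4) / np.sqrt(4 * np.pi)

# The posterior is N(y/2, 1/2); draw from it directly (in practice
# these draws would come from an MCMC sampler).
theta = rng.normal(y / 2, np.sqrt(0.5), size=200_000)

# Likelihood of y at each posterior draw.
lik = np.exp(-0.5 * (y - theta) ** 2) / np.sqrt(2 * np.pi)

# Harmonic mean identity: 1 / m(y) = E_post[1 / L(theta)],
# so the estimator is the harmonic mean of the likelihoods.
hm_estimate = 1.0 / np.mean(1.0 / lik)
print(true_marginal, hm_estimate)
```

Even in this well-behaved example, 1/L(θ) has infinite variance under the posterior, so the sample harmonic mean keeps its correct expectation while remaining vulnerable to rare draws with tiny likelihood; that is exactly the instability the paper's two stabilized variants are designed to remove.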
Bayesian Simultaneous Equations Analysis using Reduced Structures
, 1997
Abstract

Cited by 21 (3 self)
Diffuse priors lead to pathological posterior behavior when used in Bayesian analyses of Simultaneous Equation Models (SEMs). This results from the local non-identification of certain parameters in SEMs. When this a priori known feature is not captured appropriately, the posterior favors certain specific parameter values not as a consequence of strong data information but of local non-identification. We show that a proper consistent Bayesian analysis of a SEM explicitly has to consider the reduced form of the SEM as a standard linear model on which nonlinear (reduced rank) restrictions, resulting from a singular value decomposition, are imposed. The priors/posteriors of the parameters of the SEM are therefore proportional to the priors/posteriors of the parameters of the linear model under the condition that the restrictions hold. This leads to a framework for constructing priors and posteriors for the parameters of SEMs. The framework is used to construct priors and pos...
Priors, Posteriors and Bayes Factors for a Bayesian Analysis of Cointegration
 Journal of Econometrics
, 1999
Abstract

Cited by 20 (3 self)
Cointegration occurs when the long run multiplier of a vector autoregressive model exhibits rank reduction. Priors and posteriors of the parameters of the cointegration model are therefore proportional to priors and posteriors of the long run multiplier given that it has reduced rank. Rank reduction of the long run multiplier is modelled using a decomposition resulting from its singular value decomposition. It specifies the long run multiplier matrix as the sum of a matrix that equals the product of the adjustment parameters and the cointegrating vectors, i.e. the cointegration specification, and a matrix that models the deviation from cointegration. Priors and posteriors for the parameters of the cointegration model are obtained by restricting the latter matrix to zero in the prior and posterior of the unrestricted long run multiplier. The special decomposition of the long run multiplier results in unique posterior densities. This theory leads to a complete Bayesian framework for cointegration analysis. It includes prior specification, simulation schemes for obtaining posterior distributions and determination of the cointegration rank via Bayes factors. We illustrate the analysis with several simulated series, the UK data
Estimating Ratios of Normalizing Constants for Densities with Different Dimensions
 STATISTICA SINICA
, 1997
Abstract

Cited by 17 (2 self)
In Bayesian inference, a Bayes factor is defined as the ratio of posterior odds versus prior odds, where the posterior odds is simply a ratio of the normalizing constants of two posterior densities. In many practical problems, the two posteriors have different dimensions. For such cases, the current Monte Carlo methods such as the bridge sampling method (Meng and Wong 1996), the path sampling method (Gelman and Meng 1994), and the ratio importance sampling method (Chen and Shao 1994) cannot directly be applied. In this article, we extend importance sampling, bridge sampling, and ratio importance sampling to problems of different dimensions. Then we find global optimal importance sampling, bridge sampling, and ratio importance sampling in the sense of minimizing asymptotic relative mean-square errors of estimators. Implementation algorithms, which can asymptotically achieve the optimal simulation errors, are developed and two illustrative examples are also provided.
Warp bridge sampling
 J. Comp. Graph. Statist
, 2002
Abstract

Cited by 16 (1 self)
Bridge sampling, a general formulation of the acceptance ratio method in physics for computing free-energy differences, is an effective Monte Carlo method for computing normalizing constants of probability models. The method was originally proposed for cases where the probability models have overlapping support. Voter proposed the idea of shifting physical systems before applying the acceptance ratio method to calculate free-energy differences between systems that are highly separated in a configuration space. The purpose of this article is to push Voter’s idea further by applying more general transformations, including stochastic transformations resulting from mixing over transformation groups, to the underlying variables before performing bridge sampling. We term such methods warp bridge sampling to highlight the fact that in addition to location shifting (i.e., centering) one can further reduce the difference/distance between two densities by warping their shapes without changing the normalizing constants. Real data-based empirical studies using the full-information item factor model and a nonlinear mixed model are provided to demonstrate the potentially substantial gains in Monte Carlo efficiency by going beyond centering and by using efficient bridge sampling estimators. Our general method is also applicable to a couple of recent proposals for computing marginal likelihoods and Bayes factors because these methods turn out to be covered by the general bridge sampling framework.
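The bridge sampling identity underlying this article can be sketched with the geometric bridge α = 1/√(q1·q2), one common choice among the general bridge functions the framework covers; the two densities below are illustrative, chosen so the answer is known exactly:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two unnormalized densities with known normalizing constants, so the
# estimate can be checked: q1 is an unnormalized N(0, 1) with
# Z1 = sqrt(2*pi), q2 an unnormalized N(0, 2) with Z2 = sqrt(4*pi).
def q1(x):
    return np.exp(-0.5 * x**2)

def q2(x):
    return np.exp(-0.25 * x**2)

true_ratio = 1 / np.sqrt(2)  # Z1 / Z2

# Draws from each normalized density (in practice: MCMC output).
x1 = rng.normal(0.0, 1.0, size=100_000)
x2 = rng.normal(0.0, np.sqrt(2.0), size=100_000)

# Bridge sampling with the geometric bridge alpha = 1/sqrt(q1*q2):
#   Z1 / Z2 = E_p2[sqrt(q1/q2)] / E_p1[sqrt(q2/q1)].
ratio = np.mean(np.sqrt(q1(x2) / q2(x2))) / np.mean(np.sqrt(q2(x1) / q1(x1)))
print(ratio, true_ratio)
```

The estimator works here because the two densities overlap substantially; warp bridge sampling's contribution is to transform x1 and x2 (by centering, scaling, or more general warps that leave the normalizing constants unchanged) so that the same estimator stays efficient even when the original densities are far apart.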
Testing for Integration using evolving Trend and Seasonals Models: A Bayesian Approach
, 1999
Abstract

Cited by 7 (5 self)
In this paper, we make use of state space models to investigate the presence of stochastic trends in economic time series. A model is specified where such a trend can enter either in the autoregressive representation or in a separate state equation. Tests based on the former are analogous to Dickey-Fuller tests of unit roots, while the latter are analogous to KPSS tests of trend-stationarity. We use Bayesian methods to survey the properties of the likelihood function in such models and to calculate posterior odds ratios comparing models with and without stochastic trends. We extend these ideas to the problem of testing for integration at seasonal frequencies and show how our techniques can be used to carry out Bayesian variants of either the HEGY or Canova-Hansen test. Stochastic integration rules, based on Markov Chain Monte Carlo, as well as deterministic integration rules are used. Strengths and weaknesses of each approach are indicated.
Continuous Contour Monte Carlo for Marginal Density Estimation with an Application to Spatial Statistical Model
, 2006
Abstract

Cited by 7 (3 self)
The problem of marginal density estimation for a multivariate density function f(x) can be generally stated as a problem of density function estimation for a random vector λ(x) of dimension lower than that of x. In this paper, we propose a technique, the so-called continuous Contour Monte Carlo (CCMC) algorithm, for solving this problem. CCMC can be viewed as a continuous version of the contour Monte Carlo (CMC) algorithm recently proposed in the literature. CCMC abandons the use of sample space partitioning and incorporates the techniques of kernel density estimation into its simulations. CCMC is more general than other marginal density estimation algorithms. First, it works for any density function, even those having a rugged or unbalanced energy landscape. Second, it works for any transformation λ(x) regardless of the availability of the analytical form of the inverse transformation. In this paper, CCMC is applied to estimate the unknown normalizing constant function for a spatial autologistic model, and the estimate is then used in a Bayesian analysis for the spatial autologistic model in place of the true normalizing constant function. Numerical results on the US cancer mortality data indicate that the Bayesian method can produce much more accurate estimates than the MPLE and MCMLE methods for the parameters of the spatial autologistic model.