Results 1–10 of 81
Has the U.S. Economy Become More Stable? A Bayesian Approach Based on a Markov-Switching Model of the Business Cycle
, 1999
Abstract

Cited by 322 (13 self)
We hope to be able to provide answers to the following questions: 1) Has there been a structural break in postwar U.S. real GDP growth toward more stabilization? 2) If so, when would it have been? 3) What is the nature of the structural break? For this purpose, we employ a Bayesian approach to dealing with a structural break at an unknown changepoint in a Markov-switching model of the business cycle. Empirical results suggest that there has been a structural break in U.S. real GDP growth toward more stabilization, with the posterior mode of the break date around 1984:1. Furthermore, we find that a narrowing gap between growth rates during recessions and booms is at least as important as a decline in the volatility of shocks. Key Words: Bayes Factor, Gibbs Sampling, Marginal Likelihood, Markov-Switching, Stabilization, Structural Break. JEL Classifications: C11, C12, C22, E32. 1. Introduction In the literature, the issue of postwar stabilization of the U.S. economy relative to the prewar period has...
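The changepoint machinery the abstract describes can be illustrated in heavily simplified form, without the Markov-switching component. The sketch below (all data, priors, and the IG(2, 2) hyperparameters are hypothetical, not from the paper) enumerates break dates in a zero-mean Gaussian series with conjugate inverse-gamma variance priors, so each segment's marginal likelihood is available in closed form:

```python
import math
import random

random.seed(11)

# Conjugate inverse-gamma IG(a, b) prior on the segment variance;
# hyperparameters are arbitrary illustrative choices.
a, b = 2.0, 2.0

def log_marg(segment):
    """Closed-form log marginal likelihood of one constant-variance segment."""
    n, ss = len(segment), sum(v * v for v in segment)
    return (-0.5 * n * math.log(2 * math.pi) + a * math.log(b)
            + math.lgamma(a + 0.5 * n) - math.lgamma(a)
            - (a + 0.5 * n) * math.log(b + 0.5 * ss))

# Simulated series: high volatility for 60 periods, then stabilization.
y = [random.gauss(0.0, 2.0) for _ in range(60)] + \
    [random.gauss(0.0, 0.7) for _ in range(60)]

# Posterior over the break date k (uniform prior on k): the marginal
# likelihood factorizes into the two segments around the break.
log_post = {k: log_marg(y[:k]) + log_marg(y[k:]) for k in range(5, len(y) - 5)}
k_hat = max(log_post, key=log_post.get)

# Bayes-factor style comparison against the no-break model.
log_m_break, log_m_nobreak = log_post[k_hat], log_marg(y)
print(f"posterior mode of break date: {k_hat}")
print(f"log evidence gap (break at mode vs. no break): "
      f"{log_m_break - log_m_nobreak:.1f}")
```

A proper Bayes factor would average exp(log_post[k]) over k rather than plug in the mode; the enumeration above stands in for the Gibbs sampling the paper actually uses.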
Model selection and model averaging in phylogenetics: Advantages of the AIC and Bayesian approaches over likelihood ratio tests. Syst. Biol
, 2004
Abstract

Cited by 180 (5 self)
Abstract.—Model selection is a topic of special relevance in molecular phylogenetics that affects many, if not all, stages of phylogenetic inference. Here we discuss some fundamental concepts and techniques of model selection in the context of phylogenetics. We start by reviewing different aspects of the selection of substitution models in phylogenetics from theoretical, philosophical, and practical points of view, and summarize this comparison in table format. We argue that the most commonly implemented model selection approach, the hierarchical likelihood ratio test, is not the optimal strategy for model selection in phylogenetics, and that approaches like the Akaike Information Criterion (AIC) and Bayesian methods offer important advantages. In particular, the latter two methods are able to simultaneously compare multiple nested or non-nested models, assess model selection uncertainty, and allow for the estimation of phylogenies and model parameters using all available models (model-averaged inference or multimodel inference). We also describe how the relative importance of the different parameters included in substitution models can be depicted. To illustrate some of these points, we have applied AIC-based model averaging to 37 mitochondrial DNA sequences from the subgenus Ohomopterus (genus Carabus) ground beetles described by Sota and Vogler (2001). [AIC; Bayes factors; BIC; likelihood ratio tests; model averaging; model uncertainty; model selection; multimodel inference.] It is clear that models of nucleotide substitution (henceforth models of evolution) play a significant role...
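The AIC-based model averaging the abstract mentions rests on Akaike weights: each candidate model's weight is proportional to exp(-0.5 × ΔAIC). A minimal sketch, where the substitution-model names and AIC values are made up for illustration:

```python
import math

def akaike_weights(aics):
    """Akaike weights: w_i proportional to exp(-0.5 * (AIC_i - AIC_min))."""
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three substitution models.
aic = {"JC69": 2150.3, "HKY85": 2101.7, "GTR+G": 2098.2}
for name, w in zip(aic, akaike_weights(list(aic.values()))):
    print(f"{name}: weight = {w:.3f}")
```

The weights sum to one and can then weight per-model parameter estimates to produce the model-averaged (multimodel) inferences described above.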
Bayesian phylogenetic analysis of combined data
 Syst. Biol
, 2004
Abstract

Cited by 117 (5 self)
Abstract. — The recent development of Bayesian phylogenetic inference using Markov chain Monte Carlo (MCMC) techniques has facilitated the exploration of parameter-rich evolutionary models. At the same time, stochastic models have become more realistic (and complex) and have been extended to new types of data, such as morphology. Based on this foundation, we developed a Bayesian MCMC approach to the analysis of combined data sets and explored its utility in inferring relationships among gall wasps based on data from morphology and four genes (nuclear and mitochondrial, ribosomal and protein coding). Examined models range in complexity from those recognizing only a morphological and a molecular partition to those having complex substitution models with independent parameters for each gene. Bayesian MCMC analysis deals efficiently with complex models: convergence occurs faster and more predictably for complex models, mixing is adequate for all parameters even under very complex models, and the parameter update cycle is virtually unaffected by model partitioning across sites. Morphology contributed only 5% of the characters in the data set but nevertheless influenced the combined-data tree, supporting the utility of morphological data in multigene analyses. We used Bayesian criteria (Bayes factors) to show that process heterogeneity across data partitions is a significant model component, although not as important as among-site rate variation. More complex evolutionary models are associated with more topological uncertainty and less conflict between morphology and molecules. Bayes factors sometimes favor simpler models over considerably more...
Computing Bayes factors using thermodynamic integration
 Syst Biol
Abstract

Cited by 42 (6 self)
Abstract.—In the Bayesian paradigm, a common method for comparing two models is to compute the Bayes factor, defined as the ratio of their respective marginal likelihoods. In recent phylogenetic works, the numerical evaluation of marginal likelihoods has often been performed using the harmonic mean estimation procedure. In the present article, we propose to employ another method, based on an analogy with statistical physics, called thermodynamic integration. We describe the method, propose an implementation, and show on two analytical examples that this numerical method yields reliable estimates. In contrast, the harmonic mean estimator leads to a strong overestimation of the marginal likelihood, which is all the more pronounced as the model is higher dimensional. As a result, the harmonic mean estimator systematically favors more parameter-rich models, an artefact that might explain some recent puzzling observations, based on harmonic mean estimates, suggesting that Bayes factors tend to overscore complex models. Finally, we apply our method to the comparison of several alternative models of amino-acid replacement. We confirm our previous observations, indicating that modeling pattern heterogeneity across sites tends to yield better models than standard empirical matrices. [Bayes factor; harmonic mean; mixture model; path sampling; phylogeny; thermodynamic integration.] Bayesian methods have become popular in molecular phylogenetics in recent years. The simple and intuitive interpretation of the concept of probabilities...
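Thermodynamic integration writes the log marginal likelihood as an integral over an inverse temperature, log m = ∫₀¹ E_β[log L] dβ, with the expectation taken under the power posterior ∝ prior × likelihood^β. A toy sketch under strong simplifying assumptions (a hypothetical conjugate normal model, chosen so every power posterior is exactly Gaussian and the true answer is known):

```python
import math
import random

random.seed(1)

# Toy conjugate model: y_i ~ N(theta, 1), prior theta ~ N(0, 1).
# The power posterior at inverse temperature b is proportional to
# prior(theta) * likelihood(theta)^b, again Gaussian with
# precision 1 + b*n and mean b*S / (1 + b*n), where S = sum(y).
n = 10
y = [random.gauss(0.5, 1.0) for _ in range(n)]
S = sum(y)
SS = sum(v * v for v in y)

def log_lik(theta):
    return -0.5 * n * math.log(2 * math.pi) - 0.5 * (SS - 2 * theta * S + n * theta * theta)

def expected_log_lik(b, draws=2000):
    """Monte Carlo estimate of E[log L] under the power posterior at b."""
    prec = 1.0 + b * n
    mean, sd = b * S / prec, 1.0 / math.sqrt(prec)
    return sum(log_lik(random.gauss(mean, sd)) for _ in range(draws)) / draws

# Trapezoid rule on a grid packed near b = 0, where the integrand
# changes fastest.
K = 50
betas = [(k / K) ** 3 for k in range(K + 1)]
vals = [expected_log_lik(b) for b in betas]
ti = sum(0.5 * (vals[k] + vals[k + 1]) * (betas[k + 1] - betas[k]) for k in range(K))

# Exact log marginal likelihood for this conjugate model.
exact = (-0.5 * n * math.log(2 * math.pi)
         - 0.5 * math.log(1 + n)
         - 0.5 * (SS - S * S / (1 + n)))
print(f"TI estimate: {ti:.3f}   exact: {exact:.3f}")
```

In a real phylogenetic model the power-posterior draws come from MCMC rather than exact sampling, but the β-grid and quadrature step are the same.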
Bayesian Simultaneous Equations Analysis using Reduced Structures
, 1997
Abstract

Cited by 27 (3 self)
Diffuse priors lead to pathological posterior behavior when used in Bayesian analyses of Simultaneous Equation Models (SEMs). This results from the local non-identification of certain parameters in SEMs. When this a priori known feature is not captured appropriately, a posterior preference for certain specific parameter values results that is the consequence not of strong data information but of local non-identification. We show that a proper consistent Bayesian analysis of a SEM explicitly has to consider the reduced form of the SEM as a standard linear model on which nonlinear (reduced rank) restrictions are imposed, which result from a singular value decomposition. The priors/posteriors of the parameters of the SEM are therefore proportional to the priors/posteriors of the parameters of the linear model under the condition that the restrictions hold. This leads to a framework for constructing priors and posteriors for the parameters of SEMs. The framework is used to construct priors and pos...
Estimating the integrated likelihood via posterior simulation using the harmonic mean identity
 Bayesian Statistics
, 2007
Abstract

Cited by 26 (2 self)
The integrated likelihood (also called the marginal likelihood or the normalizing constant) is a central quantity in Bayesian model selection and model averaging. It is defined as the integral over the parameter space of the likelihood times the prior density. The Bayes factor for model comparison and Bayesian testing is a ratio of integrated likelihoods, and the model weights in Bayesian model averaging are proportional to the integrated likelihoods. We consider the estimation of the integrated likelihood from posterior simulation output, aiming at a generic method that uses only the likelihoods from the posterior simulation iterations. The key is the harmonic mean identity, which says that the reciprocal of the integrated likelihood is equal to the posterior harmonic mean of the likelihood. The simplest estimator based on the identity is thus the harmonic mean of the likelihoods. While this is an unbiased and simulation-consistent estimator, its reciprocal can have infinite variance and so it is unstable in general. We describe two methods for stabilizing the harmonic mean estimator. In the first one, the parameter space is reduced in such a way that the modified estimator involves a harmonic mean of heavier-tailed densities, thus resulting in a finite variance estimator. The resulting...
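The harmonic mean identity the abstract builds on, 1/m = E_posterior[1/L(θ)], can be tried directly on a toy conjugate model where the exact answer is known (all data and priors below are hypothetical). The instability described above shows up as rare low-likelihood draws dominating the average of 1/L:

```python
import math
import random

random.seed(7)

# Toy conjugate model: y_i ~ N(theta, 1), prior theta ~ N(0, 1),
# so the posterior is N(S / (1 + n), 1 / (1 + n)) and the integrated
# likelihood has a closed form for comparison.
n = 10
y = [random.gauss(0.5, 1.0) for _ in range(n)]
S, SS = sum(y), sum(v * v for v in y)

def log_lik(theta):
    return -0.5 * n * math.log(2 * math.pi) - 0.5 * (SS - 2 * theta * S + n * theta * theta)

def log_sum_exp(xs):
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# Harmonic mean identity: 1/m = E_posterior[1/L], so on the log scale
# log m_hat = log N - logsumexp(-log L(theta_i)) over posterior draws.
draws = 20000
mean, sd = S / (1 + n), 1 / math.sqrt(1 + n)
lls = [log_lik(random.gauss(mean, sd)) for _ in range(draws)]
log_m_hat = math.log(draws) - log_sum_exp([-ll for ll in lls])

exact = (-0.5 * n * math.log(2 * math.pi) - 0.5 * math.log(1 + n)
         - 0.5 * (SS - S * S / (1 + n)))
print(f"harmonic mean estimate: {log_m_hat:.3f}   exact: {exact:.3f}")
```

Even in this one-dimensional example 1/L has infinite posterior variance, so the estimate typically overshoots the exact value; the paper's stabilizations target exactly this failure mode.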
Priors, Posteriors and Bayes Factors for a Bayesian Analysis of Cointegration
 Journal of Econometrics
, 1999
Abstract

Cited by 26 (3 self)
Cointegration occurs when the long run multiplier of a vector autoregressive model exhibits rank reduction. Priors and posteriors of the parameters of the cointegration model are therefore proportional to priors and posteriors of the long run multiplier given that it has reduced rank. Rank reduction of the long run multiplier is modelled using a decomposition resulting from its singular value decomposition. It specifies the long run multiplier matrix as the sum of a matrix that equals the product of the adjustment parameters and the cointegrating vectors, i.e. the cointegration specification, and a matrix that models the deviation from cointegration. Priors and posteriors for the parameters of the cointegration model are obtained by restricting the latter matrix to zero in the prior and posterior of the unrestricted long run multiplier. The special decomposition of the long run multiplier results in unique posterior densities. This theory leads to a complete Bayesian framework for cointegration analysis. It includes prior specification, simulation schemes for obtaining posterior distributions and determination of the cointegration rank via Bayes factors. We illustrate the analysis with several simulated series, the UK data
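The rank reduction described above can be made concrete with a tiny numerical sketch (all matrices below are hypothetical illustrations, not from the paper): the long-run multiplier of a cointegrated system equals the product of the adjustment parameters and the cointegrating vectors, and its singular value decomposition separates that cointegration part from the deviation matrix that the priors/posteriors restrict to zero.

```python
import numpy as np

# Hypothetical 2-variable system with cointegration rank r = 1.
alpha = np.array([[0.4], [-0.2]])   # adjustment parameters (k x r)
beta = np.array([[1.0], [-0.8]])    # cointegrating vector  (k x r)
Pi = alpha @ beta.T                 # long-run multiplier, reduced rank

U, s, Vt = np.linalg.svd(Pi)
print("singular values:", np.round(s, 6))  # trailing value ~0: rank 1

# The leading singular triplet is the cointegration specification;
# the discarded remainder is the deviation-from-cointegration matrix.
Pi_r = s[0] * np.outer(U[:, 0], Vt[0, :])
print("deviation norm:", np.linalg.norm(Pi - Pi_r))
```

With an estimated (full-rank) long-run multiplier the trailing singular values are merely small rather than zero, which is what the Bayes-factor rank tests in the paper assess.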
Estimating Ratios of Normalizing Constants for Densities With Different Dimensions
 Statistica Sinica
, 1997
Abstract

Cited by 16 (2 self)
Abstract: In Bayesian inference, a Bayes factor is defined as the ratio of posterior odds versus prior odds where posterior odds is simply a ratio of the normalizing constants of two posterior densities. In many practical problems, the two posteriors have different dimensions. For such cases, the current Monte Carlo methods such as the bridge sampling method (Meng and Wong (1996)), the path sampling method (Gelman and Meng (1994)), and the ratio importance sampling method (Chen and Shao (1997)) cannot directly be applied. In this article, we extend importance sampling, bridge sampling, and ratio importance sampling to problems of different dimensions. Then we find global optimal importance sampling, bridge sampling, and ratio importance sampling in the sense of minimizing asymptotic relative mean-square errors of estimators. Implementation algorithms, which can asymptotically achieve the optimal simulation errors, are developed and two illustrative examples are also provided.
Warp bridge sampling
 J. Comp. Graph. Statist
, 2002
Abstract

Cited by 15 (1 self)
Bridge sampling, a general formulation of the acceptance ratio method in physics for computing free-energy difference, is an effective Monte Carlo method for computing normalizing constants of probability models. The method was originally proposed for cases where the probability models have overlapping support. Voter proposed the idea of shifting physical systems before applying the acceptance ratio method to calculate free-energy differences between systems that are highly separated in a configuration space. The purpose of this article is to push Voter's idea further by applying more general transformations, including stochastic transformations resulting from mixing over transformation groups, to the underlying variables before performing bridge sampling. We term such methods warp bridge sampling to highlight the fact that in addition to location shifting (i.e., centering) one can further reduce the difference/distance between two densities by warping their shapes without changing the normalizing constants. Real data-based empirical studies using the full-information item factor model and a nonlinear mixed model are provided to demonstrate the potentially substantial gains in Monte Carlo efficiency by going beyond centering and by using efficient bridge sampling estimators. Our general method is also applicable to a couple of recent proposals for computing marginal likelihoods and Bayes factors because these methods turn out to be covered by the general bridge sampling framework.
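A minimal illustration of the centering (location-shift) idea, under the assumption that shifts alone suffice: two toy unnormalized Gaussian densities with far-apart modes make the plain geometric-bridge estimator unstable, while recentering each density and its draws at zero, which leaves the normalizing constants unchanged, recovers the true ratio:

```python
import math
import random

random.seed(3)

# Two unnormalized 1-d densities with well-separated modes (toy
# stand-ins for posteriors under two models); the true ratio of
# their normalizing constants is sqrt(2*pi)/sqrt(4*pi) = 1/sqrt(2).
q1 = lambda x: math.exp(-0.5 * x * x)            # ~ N(0, 1), mode at 0
q2 = lambda x: math.exp(-0.25 * (x - 8.0) ** 2)  # ~ N(8, 2), mode at 8

def bridge(qa, qb, draws_a, draws_b):
    """Geometric-bridge estimate of c_a / c_b from draws of each density."""
    num = sum(math.sqrt(qa(x) / qb(x)) for x in draws_b) / len(draws_b)
    den = sum(math.sqrt(qb(x) / qa(x)) for x in draws_a) / len(draws_a)
    return num / den

n = 5000
d1 = [random.gauss(0.0, 1.0) for _ in range(n)]
d2 = [random.gauss(8.0, math.sqrt(2.0)) for _ in range(n)]

# Naive bridge: almost no overlap between the two sets of draws, so
# the estimate is dominated by rare tail draws and is unstable.
naive = bridge(q1, q2, d1, d2)

# Location warp: recenter each density and its draws at 0. Shifting
# does not change a normalizing constant, so the warped bridge still
# estimates c1 / c2, now from well-overlapping draws.
w1 = q1
w2 = lambda x: q2(x + 8.0)
warped = bridge(w1, w2, d1, [x - 8.0 for x in d2])

print(f"naive: {naive:.4f}   warped: {warped:.4f}   true: {1/math.sqrt(2):.4f}")
```

The article's full method goes further, warping shapes (and mixing over transformation groups) rather than only shifting locations.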