Results 1–10 of 179
Bayesian Data Analysis, 1995
Abstract
Cited by 1508 (49 self)
I actually own a copy of Harold Jeffreys’s Theory of Probability but have only read small bits of it, most recently over a decade ago to confirm that, indeed, Jeffreys was not too proud to use a classical chi-squared p-value when he wanted to check the misfit of a model to data (Gelman, Meng and Stern, 2006). I do, however, feel that it is important to understand where our probability models come from, and I welcome the opportunity to use the present article by Robert, Chopin and Rousseau as a platform for further discussion of foundational issues. In this brief discussion I will argue the following: (1) in thinking about prior distributions, we should go beyond Jeffreys’s principles and move toward weakly informative priors; (2) it is natural for those of us who work in social and computational sciences to favor complex models, contra Jeffreys’s preference for simplicity; and (3) a key generalization of Jeffreys’s ideas is to explicitly include model checking in the process of data analysis.
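The classical chi-squared check mentioned above can be sketched in a few lines. This is a hypothetical illustration (the counts are invented, not from any cited study): observed category counts are compared against the counts a simple uniform model implies, and a small p-value flags misfit.

```python
import numpy as np
from scipy import stats

# Invented example counts: four categories, 100 observations total.
observed = np.array([18, 25, 22, 35])
# Counts implied by the model being checked (all categories equally likely).
expected = np.array([25, 25, 25, 25])

# Classical chi-squared goodness-of-fit test of model against data.
chi2, p_value = stats.chisquare(observed, f_exp=expected)
# A small p-value indicates misfit between the model and the data.
```

Here the statistic is 6.32 on 3 degrees of freedom, so the uniform model is not clearly rejected; the point is only that such a frequency-based check fits naturally into a Bayesian workflow as a model check.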
Mplus: Statistical Analysis with Latent Variables (Version 4.21) [Computer software], 2007
Abstract
Cited by 51 (0 self)
Chapter 3: Regression and path analysis (p. 19)
Chapter 4: Exploratory factor analysis (p. 43)
Modeling changing dependency structure in multivariate time series, in International Conference on Machine Learning, 2007
Abstract
Cited by 32 (0 self)
We show how to apply the efficient Bayesian changepoint detection techniques of Fearnhead in the multivariate setting. We model the joint density of vector-valued observations using undirected Gaussian graphical models, whose structure we estimate. We show how we can exactly compute the MAP segmentation, as well as how to draw perfect samples from the posterior over segmentations, simultaneously accounting for uncertainty about the number and location of changepoints, as well as uncertainty about the covariance structure. We illustrate the technique by applying it to financial data and to bee tracking data.
An Evaluation of Four Sacramento-San Joaquin River Delta Juvenile Salmon Survival Studies, 1991
Abstract
Cited by 17 (1 self)
conducted several multi-year release-recovery experiments with coded-wire-tagged juvenile Chinook salmon. The objectives of the studies were (1) to estimate survival through the lower portions of the Sacramento and San Joaquin river systems, the California Delta, and (2) to quantify the factors affecting survival. Four of these studies, listed more or less by their historical start dates, are the Delta Cross Channel, Interior, Delta Action 8, and VAMP experiments. Delta Cross Channel: These studies focused on how the position of the Delta cross-channel (DCC) gate affected survival of outmigrating juvenile salmon. When the gate(s) is open, water flow from the Sacramento river into the central Delta increases. The a priori hypothesis for these studies was that survival would be lowered with the gate open, since the probability of entering the interior Delta would increase and the fish would thereby be more vulnerable to the water export pumps at the state water project (SWP) and at the federal Central Valley project (CVP). Temporally paired releases were made above the DCC (near Courtland) and below the DCC (at Ryde), and recoveries were made at Chipps Island and in the ocean fisheries.
Bayesian analysis of latent variable models using Mplus. Version 4. Retrieved from http://www.statmodel.com/download/BayesAdvantages18.pdf (Asparouhov), University of Barcelona, 1996
Abstract
Cited by 14 (7 self)
In this paper we describe some of the modeling possibilities that are now available in Mplus Version 6 with the Bayesian methodology. This new methodology offers many new possibilities but also many challenges. The paper is intended to spur more research rather than to provide complete an ...
Responses to monetary policy shocks in the east and the west of Europe: a comparison, Center for Social and Economic Research 287
Abstract
Cited by 13 (0 self)
This paper compares impulse responses to monetary policy shocks in the euro area countries before the EMU and in the New Member States (NMS) from central-eastern Europe. We mitigate the small-sample problem, which is especially acute for the NMS, by using a Bayesian estimation that combines information across countries. The impulse responses in the NMS are broadly similar to those in the euro area countries. There is some evidence that in the NMS, which have had higher and more volatile inflation, the Phillips curve is steeper than in the euro area countries. This finding is consistent with economic theory.
Struggles with Survey Weighting and Regression Modeling, Statistical Science, 2007
Abstract
Cited by 12 (2 self)
The general principles of Bayesian data analysis imply that models for survey responses should be constructed conditional on all variables that affect the probability of inclusion and nonresponse, which are also the variables used in survey weighting and clustering. However, such models can quickly become very complicated, with potentially thousands of poststratification cells. It is then a challenge to develop general families of multilevel probability models that yield reasonable Bayesian inferences. We discuss these issues in the context of several ongoing public health and social surveys. This work is currently open-ended, and we conclude with thoughts on how research could proceed to solve these problems. Key words and phrases: multilevel modeling, poststratification, sam ...
Why we (usually) don’t have to worry about multiple comparisons, 2008
Abstract
Cited by 11 (5 self)
The problem of multiple comparisons can disappear when viewed from a Bayesian perspective. We propose building multilevel models in the settings where multiple comparisons arise. These address the multiple comparisons problem and also yield more efficient estimates, especially in settings with low group-level variation, which is where multiple comparisons are a particular concern. Multilevel models perform partial pooling (shifting estimates toward each other), whereas classical procedures typically keep the centers of intervals stationary, adjusting for multiple comparisons by making the intervals wider (or, equivalently, adjusting the p-values corresponding to intervals of fixed width). Multilevel estimates make comparisons more conservative, in the sense that intervals for comparisons are more likely to include zero; as a result, those comparisons that are made with confidence are more likely to be valid.
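The partial pooling described in this abstract can be sketched with a toy normal-normal shrinkage calculation. All numbers below are invented for illustration: raw per-group estimates are pulled toward the grand mean, with the amount of shrinkage set by the ratio of sampling noise to group-level variation.

```python
import numpy as np

# Invented per-group point estimates (e.g., group means from a survey).
y = np.array([2.0, 5.0, 3.0, 8.0])
sigma2 = 4.0   # assumed sampling variance of each raw estimate
tau2 = 1.0     # assumed between-group variance (low -> strong pooling)
mu = y.mean()  # grand mean, 4.5

# Normal-normal shrinkage factor: fraction of the gap to the grand
# mean that each estimate gives up. Here 4 / (4 + 1) = 0.8.
shrinkage = sigma2 / (sigma2 + tau2)
pooled = y + shrinkage * (mu - y)
# pooled -> [4.0, 4.6, 4.2, 5.2]: estimates shifted toward each other.
```

Because the estimates move toward each other rather than the intervals simply widening, comparisons between groups become more conservative in exactly the sense the abstract describes.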
Default prior distributions and efficient posterior computation in Bayesian factor analysis, Journal of Computational and Graphical Statistics, 2009
Abstract
Cited by 11 (4 self)
Factor analytic models are widely used in the social sciences. These models have also proven useful for sparse modeling of the covariance structure in multidimensional data. Normal prior distributions for factor loadings and inverse-gamma prior distributions for residual variances are a popular choice because of their conditionally conjugate form. However, such prior distributions require elicitation of many hyperparameters and tend to result in poorly behaved Gibbs samplers. In addition, one must choose an informative specification, as high-variance prior distributions face problems due to impropriety of the posterior distribution. This article proposes a default, heavy-tailed prior distribution specification, which is induced through parameter expansion while facilitating efficient posterior computation. We also develop an approach to allow uncertainty in the number of factors. The methods are illustrated through simulated examples and epidemiology and toxicology applications.
Handling sparsity via the horseshoe, Journal of Machine Learning Research, W&CP
Abstract
Cited by 10 (1 self)
This paper presents a general, fully Bayesian framework for sparse supervised-learning problems based on the horseshoe prior. The horseshoe prior is a member of the family of multivariate scale mixtures of normals, and is therefore closely related to widely used approaches for sparse Bayesian learning, including, among others, Laplacian priors (e.g. the LASSO) and Student-t priors (e.g. the relevance vector machine). The advantages of the horseshoe are its robustness in handling unknown sparsity and large outlying signals. These properties are justified theoretically via a representation theorem and accompanied by comprehensive empirical experiments that compare its performance to benchmark alternatives.
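The scale-mixture representation mentioned in this abstract can be sketched directly. This is a minimal illustration (the dimension and the fixed global scale are invented choices): each coefficient is drawn as beta_j ~ N(0, lambda_j² τ²) with local scales lambda_j ~ half-Cauchy(0, 1); in the full model τ would also get a half-Cauchy prior rather than being fixed.

```python
import numpy as np

rng = np.random.default_rng(0)
p, tau = 10_000, 1.0               # illustrative dimension and global scale

# Half-Cauchy(0, 1) local scales: absolute value of a standard Cauchy draw.
lam = np.abs(rng.standard_cauchy(p))

# Horseshoe draws via the scale mixture of normals: N(0, lam^2 * tau^2).
beta = rng.normal(0.0, lam * tau)

# The sample shows the prior's signature shape: a tall spike of
# near-zero values plus very heavy tails that let large signals
# escape shrinkage.
```

Plotting a histogram of `beta` (clipped to a finite range) makes the spike-plus-heavy-tails shape visible, which is the intuition behind the robustness claims in the abstract.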