Results 1–10 of 11
Simulating Normalizing Constants: From Importance Sampling to Bridge Sampling to Path Sampling
 Statistical Science, 13, 163–185
, 1998
Cited by 146 (4 self)
Simulating ratios of normalizing constants via a simple identity: A theoretical exploration
 Statistica Sinica
, 1996
Abstract

Cited by 111 (4 self)
Abstract: Let pi(w), i = 1, 2, be two densities with common support where each density is known up to a normalizing constant: pi(w) = qi(w)/ci. We have draws from each density (e.g., via Markov chain Monte Carlo), and we want to use these draws to simulate the ratio of the normalizing constants, c1/c2. Such a computational problem is often encountered in likelihood and Bayesian inference, and arises in fields such as physics and genetics. Many methods proposed in statistical and other literature (e.g., computational physics) for dealing with this problem are based on various special cases of the following simple identity: c1/c2 = E2[q1(w)α(w)] / E1[q2(w)α(w)]. Here Ei denotes the expectation with respect to pi (i = 1, 2), and α is an arbitrary function such that the denominator is nonzero. A main purpose of this paper is to provide a theoretical study of the usefulness of this identity, with focus on (asymptotically) optimal and practical choices of α. Using a simple but informative example, we demonstrate that with sensible (not necessarily optimal) choices of α, we can reduce the simulation error by orders of magnitude when compared to the conventional importance sampling method, which corresponds to α = 1/q2. We also introduce several generalizations of this identity for handling more complicated settings (e.g., estimating several ratios simultaneously) and pose several open problems that appear to have practical as well as theoretical value. Furthermore, we discuss related theoretical and empirical work.
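The identity above is easy to exercise numerically. The following is a minimal sketch (a hypothetical test case, not an example from the paper): two unnormalized Gaussian densities whose normalizing constants both equal sqrt(2π), so the true ratio c1/c2 is exactly 1, comparing the importance-sampling choice α = 1/q2 with a geometric bridge α = 1/sqrt(q1 q2).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test case (not from the paper): q1 ~ N(0,1) and q2 ~ N(1,1),
# both unnormalized, so c1 = c2 = sqrt(2*pi) and the true ratio c1/c2 is 1.
q1 = lambda w: np.exp(-0.5 * w ** 2)
q2 = lambda w: np.exp(-0.5 * (w - 1.0) ** 2)

w1 = rng.normal(0.0, 1.0, 100_000)  # draws from p1
w2 = rng.normal(1.0, 1.0, 100_000)  # draws from p2

def ratio(alpha):
    # c1/c2 = E2[q1(w) alpha(w)] / E1[q2(w) alpha(w)]
    return np.mean(q1(w2) * alpha(w2)) / np.mean(q2(w1) * alpha(w1))

imp = ratio(lambda w: 1.0 / q2(w))                   # importance sampling, alpha = 1/q2
geo = ratio(lambda w: 1.0 / np.sqrt(q1(w) * q2(w)))  # geometric bridge
print(imp, geo)  # both close to 1
```

With this much overlap between the densities both choices do well; the paper's point is that as the densities separate, sensible choices of α degrade far more gracefully than α = 1/q2.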
Warp bridge sampling
 J. Comp. Graph. Statist
, 2002
Abstract

Cited by 15 (1 self)
Bridge sampling, a general formulation of the acceptance ratio method in physics for computing free-energy differences, is an effective Monte Carlo method for computing normalizing constants of probability models. The method was originally proposed for cases where the probability models have overlapping support. Voter proposed the idea of shifting physical systems before applying the acceptance ratio method to calculate free-energy differences between systems that are highly separated in a configuration space. The purpose of this article is to push Voter’s idea further by applying more general transformations, including stochastic transformations resulting from mixing over transformation groups, to the underlying variables before performing bridge sampling. We term such methods warp bridge sampling to highlight the fact that in addition to location shifting (i.e., centering) one can further reduce the difference/distance between two densities by warping their shapes without changing the normalizing constants. Real data-based empirical studies using the full-information item factor model and a nonlinear mixed model are provided to demonstrate the potentially substantial gains in Monte Carlo efficiency by going beyond centering and by using efficient bridge sampling estimators. Our general method is also applicable to a couple of recent proposals for computing marginal likelihoods and Bayes factors because these methods turn out to be covered by the general bridge sampling framework.
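The centering idea can be illustrated with a hypothetical one-dimensional case (not the article's item-factor example): two unnormalized Gaussians whose supports barely overlap. A plain geometric bridge between them is very noisy, while shifting the second density to the origin before bridging (location warping, which leaves the normalizing constant unchanged) recovers the true ratio accurately.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical test case: q1 ~ N(0, 1) and q2 ~ N(10, 2^2), both unnormalized,
# so c1 = sqrt(2*pi), c2 = 2*sqrt(2*pi), and the true ratio c1/c2 is 0.5.
q1 = lambda w: np.exp(-0.5 * w ** 2)
q2 = lambda w: np.exp(-((w - 10.0) ** 2) / 8.0)

w1 = rng.normal(0.0, 1.0, 50_000)   # draws from p1
w2 = rng.normal(10.0, 2.0, 50_000)  # draws from p2

def bridge(qa, qb, wa, wb):
    # geometric-bridge estimate of ca/cb from draws wa ~ pa, wb ~ pb
    alpha = lambda w: 1.0 / np.sqrt(qa(w) * qb(w))
    return np.mean(qa(wb) * alpha(wb)) / np.mean(qb(wa) * alpha(wa))

plain = bridge(q1, q2, w1, w2)  # nearly disjoint supports: very noisy

# Warp by location shift: q2s(w) = q2(w + 10) has the same normalizing
# constant c2, and w2 - 10 are draws from the shifted density.
q2s = lambda w: q2(w + 10.0)
warped = bridge(q1, q2s, w1, w2 - 10.0)
print(plain, warped)  # warped lands near the true value 0.5
```

The shifted densities still differ in scale; the article's more general warps (e.g., rescaling) would reduce that remaining mismatch too.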
MONTE CARLO LIKELIHOOD INFERENCE FOR MISSING DATA MODELS
Abstract

Cited by 4 (2 self)
We describe a Monte Carlo method to approximate the maximum likelihood estimate (MLE), when there are missing data and the observed data likelihood is not available in closed form. This method uses simulated missing data that are independent and identically distributed and independent of the observed data. Our Monte Carlo approximation to the MLE is a consistent and asymptotically normal estimate of the minimizer θ* of the Kullback–Leibler information, as both Monte Carlo and observed data sample sizes go to infinity simultaneously. Plug-in estimates of the asymptotic variance are provided for constructing confidence regions for θ*. We give Logit-Normal generalized linear mixed model examples, calculated using an R package. AMS 2000 subject classifications. Primary 62F12; secondary 65C05. Key words and phrases. Asymptotic theory, Monte Carlo, maximum likelihood, generalized
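The scheme can be sketched on a toy model (a hypothetical example, not the paper's Logit-Normal one): missing u ~ N(θ, 1) and observed x | u ~ N(u, 1), so marginally x ~ N(θ, 2) and the exact MLE of θ is the sample mean, which gives a check on the Monte Carlo approximation. The missing data are simulated iid from a fixed importance density h, independent of the observed data, and the same draws are reused for every θ.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical toy model: missing u ~ N(theta, 1), observed x | u ~ N(u, 1),
# so marginally x ~ N(theta, 2) and the exact MLE of theta is x.mean().
x = rng.normal(1.0, np.sqrt(2.0), 200)  # observed data, true theta = 1
u = rng.normal(0.0, 2.0, 5_000)         # simulated missing data, iid from h = N(0, 4)

log_h = -0.5 * (u / 2.0) ** 2 - np.log(2.0) - 0.5 * np.log(2 * np.pi)
# log f(x_i | u_j) does not depend on theta, so compute it once (200 x 5000)
lp_xu = -0.5 * (x[:, None] - u[None, :]) ** 2 - 0.5 * np.log(2 * np.pi)

def mc_loglik(theta):
    # log L(theta) ~= sum_i log (1/m) sum_j f_theta(u_j) f(x_i | u_j) / h(u_j)
    lp_u = -0.5 * (u - theta) ** 2 - 0.5 * np.log(2 * np.pi)
    lw = lp_u[None, :] + lp_xu - log_h[None, :]
    m = lw.max(axis=1, keepdims=True)  # log-sum-exp for numerical stability
    return float(np.sum(m[:, 0] + np.log(np.mean(np.exp(lw - m), axis=1))))

grid = np.linspace(0.0, 2.0, 81)
theta_hat = grid[int(np.argmax([mc_loglik(t) for t in grid]))]
print(theta_hat, x.mean())  # the two estimates nearly coincide
```

The grid search is only for transparency; any standard optimizer applied to `mc_loglik` would serve, and the paper's asymptotics describe how this maximizer behaves as both sample sizes grow.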
Asthma And Allergic Diseases In Australian Twins And Their Families
, 1997
Abstract

Cited by 1 (0 self)
The occurrence of asthma or wheezing, and other allergic diseases, in 3808 pairs of twins aged 18 to 88 years was recorded by mailed questionnaire in 1980 (a pairwise response rate of 64%; individual, 69%). This sample (Cohort 1) was resurveyed in 1988 (78% pairwise follow-up), and a further 2159 pairs aged 18 to 25 years (Cohort 2) responded usefully to a similar item on asthma on another instrument in 1989 sent to 4078 pairs (pairwise 53%). The crude cumulative incidence of wheezing was 13.2% in 1980, 18.9% in 1988, and 21.8% in 1989. Genetic analyses performed using these screening data suggested a strong genetic component to wheezing, hay fever and allergy, and sizeable genetic correlations between different atopic conditions. Genetic influences specific to particular traits such as wheezing were also detectable. A secular increase in incidence of wheeze experienced by consecutive birth cohorts seemed to be due to nonfamilial environmental factors. A more detailed respiratory symptoms...
SIMULATING NORMALIZING CONSTANTS: FROM IMPORTANCE SAMPLING TO BRIDGE SAMPLING TO PATH SAMPLING
Abstract
Abstract. Computing (ratios of) normalizing constants of probability models is a fundamental computational problem for many statistical and scientific studies. Monte Carlo simulation is an effective technique, especially with complex and high-dimensional models. This paper aims to bring to the attention of general statistical audiences some effective methods originating from theoretical physics and at the same time to explore these methods from a more statistical perspective, through establishing theoretical connections and illustrating their uses with statistical problems. We show that the acceptance ratio method and thermodynamic integration are natural generalizations of importance sampling, which is most familiar to statistical audiences. The former generalizes importance sampling through the use of a single “bridge” density and is thus a case of bridge sampling in the sense of Meng and Wong. Thermodynamic integration, which is also known in the numerical analysis literature as Ogata’s method for high-dimensional integration, corresponds to the use of infinitely many and continuously connected bridges (and thus a “path”). Our path sampling formulation offers more flexibility and thus potential efficiency to thermodynamic integration, and the search for optimal paths turns out to have close connections with the Jeffreys prior density and the Rao and Hellinger distances between two densities. We provide an informative theoretical example as well as two empirical examples (involving 17- to 70-dimensional integrations) to illustrate the potential and implementation of path sampling. We also discuss some open problems.
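Path sampling can be demonstrated with a hypothetical one-dimensional toy problem (not the paper's 17- to 70-dimensional examples): a geometric path of densities between unnormalized N(0, 1) and N(0, σ²), where the exact answer log(c1/c0) = 0.5 log σ² is known, and the thermodynamic-integration identity log(c1/c0) = ∫₀¹ E_t[d/dt log q_t(w)] dt is evaluated by Monte Carlo at each point on the path and integrated by the trapezoid rule.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma2 = 4.0  # hypothetical target variance; exact log(c1/c0) = 0.5 * log(sigma2)

# Geometric path q_t(w) = q0(w)^(1-t) * q1(w)^t between q0 ~ N(0, 1) and
# q1 ~ N(0, sigma2), both unnormalized; p_t is N(0, 1/lam), lam = (1-t) + t/sigma2.
ts = np.linspace(0.0, 1.0, 21)
u = np.empty_like(ts)
for k, t in enumerate(ts):
    lam = (1.0 - t) + t / sigma2
    w = rng.normal(0.0, np.sqrt(1.0 / lam), 50_000)  # draws from p_t
    # d/dt log q_t(w) = log q1(w) - log q0(w) = -0.5 * w**2 * (1/sigma2 - 1)
    u[k] = np.mean(-0.5 * w ** 2 * (1.0 / sigma2 - 1.0))

# trapezoid rule along the path approximates the path-sampling integral
log_ratio = float(np.sum((u[1:] + u[:-1]) / 2.0) * (ts[1] - ts[0]))
print(log_ratio, 0.5 * np.log(sigma2))  # the two should be close
```

The geometric path and the uniform grid over t are just one convenient choice; the paper's point is that other paths (and prior densities over t) can be markedly more efficient.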
A statistical score for assessing the quality of multiple sequence alignments
 BMC Bioinformatics
, 2006
Approved by:
, 1981
Abstract
(Under the direction of Robert C. Elston.) A multifactorial model for the segregation analysis of quantitative traits in pedigrees is presented. The model includes both polygenic and monogenic effects. The model also allows for two types of environmental correlation, a within-sibship correlation and a within-nuclear-family correlation. Methods for the approximation of the true likelihood for this model are presented. These methods are derived from the methods for estimating the parameters of a mixture of normal distributions. The estimation methods are those of equating moments, maximum likelihood, and two methods of least squares estimation. One least squares method involves minimizing the sum of squared differences between the exact likelihood function and the approximation function; the