Results 1–10 of 26
Supplement to “Uncovering latent structure in valued graphs: A variational approach”
, 2010
Asymptotic behaviour of the posterior distribution in overfitted mixture models
, 2010
Variational Bayes for estimating the parameters of a hidden Potts model
 Stat. Comput.
, 2009
Abstract

Cited by 19 (0 self)
Hidden Markov random field models provide an appealing representation of images and other spatial problems. The drawback is that inference is not straightforward for these models as the normalisation constant for the likelihood is generally intractable except for very small observation sets. Variational methods are an emerging tool for Bayesian inference and they have already been successfully applied in other contexts. Focusing on the particular case of a hidden Potts model with Gaussian noise, we show how variational Bayesian methods can be applied to hidden Markov random field inference. To tackle the obstacle of the intractable normalising constant for the likelihood, we explore alternative estimation approaches for incorporation into the variational Bayes algorithm. We consider a pseudolikelihood approach as well as the more recent reduced dependence approximation of the normalisation constant. To illustrate the effectiveness of these approaches we present empirical results from the analysis of simulated datasets. We also analyse a real dataset and compare results with those of previous analyses as well as those obtained from the recently developed auxiliary variable MCMC method and the recursive MCMC method. Our results show that the variational Bayesian analyses can be carried out much faster than the MCMC analyses and produce good estimates of model parameters. We also found that the reduced dependence approximation of the normalisation constant outperformed the pseudolikelihood approximation in our analysis of real and synthetic datasets.
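The pseudolikelihood approach described in this abstract sidesteps the intractable normalising constant by replacing the joint likelihood with a product of single-site conditionals, each of which has a tractable local normaliser. A minimal numpy sketch of that idea for a k-state Potts labelling on a grid (our own illustration under a 4-neighbour assumption, not the authors' implementation; the function name is ours):

```python
import numpy as np

def potts_log_pseudolikelihood(z, beta, k):
    """Log-pseudolikelihood of a k-state Potts labelling z (2-D integer array)
    with 4-neighbour interactions and inverse temperature beta.
    Each factor p(z_ij | neighbours) is normalised over only k states,
    so no global normalising constant is required."""
    rows, cols = z.shape
    logpl = 0.0
    for i in range(rows):
        for j in range(cols):
            counts = np.zeros(k)  # number of neighbours in each state
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    counts[z[ni, nj]] += 1.0
            energies = beta * counts
            # log p(z_ij | neighbours) with its tractable local normaliser
            logpl += energies[z[i, j]] - np.log(np.exp(energies).sum())
    return logpl
```

At beta = 0 every site is uniform over the k states, and for beta > 0 labellings with agreeing neighbours score higher, matching the attractive Potts interaction.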
Bayesian Ying Yang system, best harmony learning, and Gaussian manifold based family
 Computational Intelligence: Research Frontiers, WCCI2008 Plenary/Invited Lectures. Lecture Notes in Computer Science
Variational Bayesian inference for parametric and nonparametric regression with missing data
 Journal of the American Statistical Association
, 2011
Abstract

Cited by 13 (3 self)
Bayesian hierarchical models are attractive structures for conducting regression analyses when the data are subject to missingness. However, the requisite probability calculus is challenging and Monte Carlo methods typically are employed. We develop an alternative approach based on deterministic variational Bayes approximations. Both parametric and nonparametric regression are treated. We demonstrate that variational Bayes can achieve good accuracy, but with considerably less computational overhead. The main ramification is fast approximate Bayesian inference in parametric and nonparametric regression models with missing data.
Variational Bayesian Analysis for Hidden Markov Models
Abstract

Cited by 6 (1 self)
The variational approach to Bayesian inference enables simultaneous estimation of model parameters and model complexity. An interesting feature of this approach is that it appears also to lead to an automatic choice of model complexity. Empirical results from the analysis of hidden Markov models with Gaussian observation densities illustrate this. If the variational algorithm is initialised with a large number of hidden states, redundant states are eliminated as the method converges to a solution, thereby leading to an automatic selection of the number of hidden states. In addition, through the use of a variational approximation, the Deviance Information Criterion (DIC) for Bayesian model selection can be extended to the hidden Markov model framework. Calculation of the DIC provides a further tool for model selection which can be used in conjunction with the variational approach.
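The DIC extension mentioned in this abstract rests on the standard sample-based definition DIC = D-bar + pD, with pD = D-bar − D(theta-bar). A minimal sketch of that computation (our own illustration, not the paper's variational estimator; in the variational setting the expectations would be taken under the approximating posterior rather than MCMC draws):

```python
import numpy as np

def dic(loglik_samples, loglik_at_posterior_mean):
    """Deviance Information Criterion from (approximate) posterior draws.
    D-bar is the posterior mean deviance, pD = D-bar - D(theta-bar) is the
    effective number of parameters, and DIC = D-bar + pD."""
    dbar = -2.0 * np.mean(loglik_samples)   # posterior mean deviance
    dhat = -2.0 * loglik_at_posterior_mean  # deviance at the posterior mean
    pd = dbar - dhat
    return dbar + pd, pd
```

When the log-likelihood is constant across draws, pD is zero and DIC reduces to the deviance itself; spread among the draws inflates pD, penalising effective model complexity.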
Bayesian modelling for biological pathway annotation of gene expression pathway signatures
, 2010
Abstract

Cited by 4 (2 self)
We present Bayesian models and computational methods for the problem of matching predictions from molecular studies with known biological pathway databases: the problem of pathway annotation of summary results of an experiment or observational study. In areas such as cancer genomics, linking quantified, experimentally defined gene expression signatures with known biological pathway gene sets is essential to improving the understanding of the complexity of molecular pathways related to outcome. Our probabilistic pathway annotation (PROPA) analysis involves new models for formal assessment and ranking of pathways putatively linked to an experimental or observational phenotype, integrates qualitative biological information into the analysis, and generates coherent inferences on uncertainties about gene pathway membership that can inform the revision of pathway databases. Our analysis relies on simulation-based computation in high-dimensional models, and introduces a novel extension of variational methods for computation of model evidence, or marginal likelihood functions, that are central to the comparison of multiple biological pathways. Examples highlight the methodology using both simulated and real data, and we develop detailed case studies in breast cancer genomics involving hormonal pathways and pathway activities underlying cellular responses to lactic acidosis in breast cancer. The second study demonstrates the application of the method in decomposing the complexity of gene expression-based predictions about interacting biological pathway activation from both experimental (in vitro) and observational (in vivo) human cancer data.
Mean field variational Bayes for continuous sparse signal shrinkage: Pitfalls and remedies. Electronic Journal of Statistics
, 2014
Abstract

Cited by 3 (1 self)
We investigate mean field variational approximate Bayesian inference for models that use continuous distributions (the Horseshoe, Negative-Exponential-Gamma and Generalized Double Pareto) for sparse signal shrinkage. Our principal finding is that the most natural, and simplest, mean field variational Bayes algorithm can perform quite poorly due to posterior dependence among auxiliary variables. More sophisticated algorithms, based on special functions, are shown to be superior. Continued fraction approximations via Lentz’s Algorithm are developed to make the algorithms practical.
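Lentz’s Algorithm, mentioned in this abstract, evaluates a continued fraction b0 + a1/(b1 + a2/(b2 + ...)) iteratively without deep recursion, rescaling numerator and denominator ratios as it goes to avoid overflow and underflow. A generic sketch of the standard modified Lentz iteration (our own illustration, not the paper's special-function code; the callables a(n) and b(n) supply the partial numerators and denominators):

```python
def lentz(a, b, tol=1e-12, tiny=1e-30, max_iter=500):
    """Modified Lentz evaluation of b(0) + a(1)/(b(1) + a(2)/(b(2) + ...)).
    a(n) and b(n) return the n-th partial numerator and denominator.
    C and D track ratios of successive convergents; tiny guards against
    division by zero when an intermediate value vanishes."""
    f = b(0)
    if f == 0.0:
        f = tiny
    C, D = f, 0.0
    for n in range(1, max_iter + 1):
        C = b(n) + a(n) / C
        if C == 0.0:
            C = tiny
        D = b(n) + a(n) * D
        if D == 0.0:
            D = tiny
        D = 1.0 / D
        delta = C * D
        f *= delta
        if abs(delta - 1.0) < tol:
            break  # successive convergents have stabilised
    return f
```

For the shrinkage priors studied in the paper, a(n) and b(n) would come from the continued-fraction expansions of the relevant special functions; the skeleton above is the same in every case.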
Comment on: “Bayesian Computation Using Design of Experiments-based Interpolation Technique”
, 2012
Abstract

Cited by 2 (1 self)
The author is to be commended on the development of this new piece of methodology, which they name DoIt. We believe that the method, or later versions of the method, has the potential to be an important element in the kitbag of non-MCMC based methods for approximate Bayesian inference. Throughout the article a number of criticisms have been leveled toward variational approximations (of which variational Bayes (VB) is a special case). As much of our recent research has been in this area, we will focus our comments in defense of this methodology. As a basis for comparison between methods we adapt the criteria listed in Ruppert, Wand & Carroll (2003, Section 3.16), upon which scatterplot smoothers may be judged, to criteria for general methodology.
1. Convenience. Is it available on the analyst’s favorite computer package?
2. Implementability. If not immediately available, how easy is it to implement in the analyst’s favorite programming language?
3. Flexibility. Is the method able to handle a wide range of models?
4. Simplicity and Tractability. Is it easy to understand how the technique processes the data to obtain answers? Is it easy to analyze the mathematical properties of the technique?
Variational Bayes Approximations for Clustering via Mixtures of Normal Inverse Gaussian Distributions
Abstract

Cited by 1 (1 self)
Parameter estimation for model-based clustering using a finite mixture of normal inverse Gaussian (NIG) distributions is achieved through variational Bayes approximations. Univariate NIG mixtures and multivariate NIG mixtures are considered. The use of variational Bayes approximations here is a substantial departure from the traditional EM approach and alleviates some of the associated computational complexities and uncertainties. Our variational algorithm is applied to simulated and real data. The paper concludes with discussion and suggestions for future work.