Results 1 – 10 of 132
Convex optimization techniques for fitting sparse Gaussian graphical models
In Proceedings of the 23rd International Conference on Machine Learning, 2006
"... We consider the problem of fitting a largescale covariance matrix to multivariate Gaussian data in such a way that the inverse is sparse, thus providing model selection. Beginning with a dense empirical covariance matrix, we solve a maximum likelihood problem with an l1norm penalty term added to e ..."
Cited by 64 (0 self)
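The ℓ1-penalized maximum-likelihood problem this entry describes is commonly known as the graphical lasso. A minimal sketch using scikit-learn's `GraphicalLasso` (an off-the-shelf solver, not the authors' own implementation; the data here are synthetic):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Ground truth: a tridiagonal (chain-graph) precision matrix, so the
# inverse covariance is sparse by construction.
p = 5
true_precision = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
true_cov = np.linalg.inv(true_precision)
X = rng.multivariate_normal(np.zeros(p), true_cov, size=2000)

# alpha is the weight of the l1 penalty on the precision matrix:
# larger alpha drives more off-diagonal entries toward exactly zero.
model = GraphicalLasso(alpha=0.05).fit(X)
est_precision = model.precision_
```

The fitted `precision_` attribute is the penalized maximum-likelihood estimate of the inverse covariance; its zero pattern gives the selected model.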
Sparse Gaussian Graphical Models with Unknown Block Structure
"... Recent work has shown that one can learn the structure of Gaussian Graphical Models by imposing an L1 penalty on the precision matrix, and then using efficient convex optimization methods to find the penalized maximum likelihood estimate. This is similar to performing MAP estimation with a prior tha ..."
Cited by 18 (1 self)
Sparse and Locally Constant Gaussian Graphical Models
"... Locality information is crucial in datasets where each variable corresponds to a measurement in a manifold (silhouettes, motion trajectories, 2D and 3D images). Although these datasets are typically undersampled and highdimensional, they often need to be represented with lowcomplexity statistical ..."
Abstract
Cited by 7 (3 self)
class of Gaussian graphical models which, together with sparseness, imposes local constancy through ℓ1-norm penalization. Second, we propose an efficient algorithm which decomposes the strictly convex maximum likelihood estimation into a sequence of problems with closed-form solutions. Through synthetic
Sparse Gaussian Graphical Models for Speech Recognition
"... We address the problem of learning the structure of Gaussian graphical models for use in automatic speech recognition, a means of controlling the form of the inverse covariance matrices of such systems. With particular focus on data sparsity issues, we implement a method for imposing graphical model ..."
Abstract
model structure on a Gaussian mixture system, using a convex optimisation technique to maximise a penalised likelihood expression. The results of initial experiments on a phone recognition task show a performance improvement over an equivalent full-covariance system. Index Terms: speech recognition
Variable Selection for Gaussian Graphical Models
, 2012
"... We present a variableselection structure learning approach for Gaussian graphical models. Unlike standard sparseness promoting techniques, our method aims at selecting the mostimportant variables besides simply sparsifying the set of edges. Through simulations, we show that our method outperforms ..."
Cited by 2 (2 self)
Latent Variable Graphical Model Selection via Convex Optimization
, 2010
"... Suppose we have samples of a subset of a collection of random variables. No additional information is provided about the number of latent variables, nor of the relationship between the latent and observed variables. Is it possible to discover the number of hidden components, and to learn a statistic ..."
Abstract
Cited by 76 (4 self)
step we give natural conditions under which such latent-variable Gaussian graphical models are identifiable given marginal statistics of only the observed variables. Essentially these conditions require that the conditional graphical model among the observed variables is sparse, while the effect
On Sparse Gaussian Chain Graph Models
"... Abstract In this paper, we address the problem of learning the structure of Gaussian chain graph models in a highdimensional space. Chain graph models are generalizations of undirected and directed graphical models that contain a mixed set of directed and undirected edges. While the problem of spa ..."
Abstract
of sparse structure learning has been studied extensively for Gaussian graphical models and more recently for conditional Gaussian graphical models (CGGMs), there has been little previous work on the structure recovery of Gaussian chain graph models. We consider linear regression models and a re
Discussion of “Latent Variable Graphical Model Selection via Convex Optimization”
"... We wish to congratulate the authors for their innovative contribution, which is bound to inspire much further research. We find latent variable model selection to be a fantastic application of matrix decomposition methods, namely, the superposition of lowrank and sparse elements. Clearly, the metho ..."
Abstract
Cited by 1 (0 self)
of the graphical lasso of Yuan and Lin [15], see also [1, 6], which is a popular approach for learning the structure in an undirected Gaussian graphical model. In this setup, we assume we have independent samples X ∼ N(0, Σ) with a covariance matrix Σ exhibiting a sparse dependence structure but otherwise unknown
DISCUSSION: LATENT VARIABLE GRAPHICAL MODEL SELECTION VIA CONVEX OPTIMIZATION
, 2012
"... We want to congratulate the authors for a thoughtprovoking and very interesting paper. Sparse modeling of the concentration matrix has enjoyed popularity in recent years. It has been framed as a computationally convenient convex 1constrained estimation problem in Yuan and Lin (2007) and can be ap ..."
Abstract
that it is formulated as a convex optimization problem. Latent variable fitting and sparse graphical modeling of the conditional distribution of the observed variables can then be obtained through a single fitting procedure. Practical aspects. The method deserves wide adoption, but this will only be realistic
Simultaneous and group-sparse multi-task learning of Gaussian graphical models
, 2012
"... In this paper, we present `1,p multitask structure learning for Gaussian graphical models. We discuss the uniqueness and boundedness of the optimal solution of the maximization problem. A block coordinate descent method leads to a provably convergent algorithm that generates a sequence of positive ..."
Cited by 2 (0 self)
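The ℓ1,p penalty in this entry couples the same edge across the K task-specific precision matrices: each off-diagonal edge contributes the ℓp norm of its K values, so an edge tends to be zero (or nonzero) in all tasks at once. A minimal numpy sketch of just the penalty value, with hypothetical function and variable names:

```python
import numpy as np

def l1p_penalty(precisions, p_norm=2):
    """l_{1,p} group penalty: for each off-diagonal edge (i, j), take the
    l_p norm of that edge's values across the K task-specific precision
    matrices, then sum over edges (each unordered edge counted once)."""
    stacked = np.stack(precisions)                          # (K, d, d)
    _, d, _ = stacked.shape
    per_edge = np.linalg.norm(stacked, ord=p_norm, axis=0)  # (d, d)
    off_diag = ~np.eye(d, dtype=bool)
    return per_edge[off_diag].sum() / 2.0  # each edge appears twice

# Two toy task precisions; edge (0, 1) is present only in the second task.
precs = [np.eye(3),
         np.array([[1.0, 0.5, 0.0],
                   [0.5, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])]
print(l1p_penalty(precs))  # 0.5
```

Setting `p_norm=np.inf` gives the ℓ1,∞ variant; in the full method this penalty is added to the summed Gaussian log-likelihoods and optimized, e.g. by the block coordinate descent the abstract mentions.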