Results 1–10 of 12
Learning Latent Tree Graphical Models
 Journal of Machine Learning Research, 2011
Abstract

Cited by 46 (10 self)
We study the problem of learning a latent tree graphical model where samples are available only from a subset of variables. We propose two consistent and computationally efficient algorithms for learning minimal latent trees, that is, trees without any redundant hidden nodes. Unlike many existing methods, the observed nodes (or variables) are not constrained to be leaf nodes. Our algorithms can be applied to both discrete and Gaussian random variables and our learned models are such that all the observed and latent variables have the same domain (state space). Our first algorithm, recursive grouping, builds the latent tree recursively by identifying sibling groups using so-called information distances. One of the main contributions of this work is our second algorithm, which we refer to as CLGrouping. CLGrouping starts with a preprocessing procedure in which a tree over the observed variables is constructed. This global step groups the observed nodes that are likely to be close to each other in the true latent tree, thereby guiding subsequent recursive grouping (or equivalent procedures such as neighbor-joining) on much smaller subsets of variables. This results in more accurate and efficient learning of latent trees. We also present regularized versions of our algorithms that learn latent tree approximations of arbitrary distributions. We compare
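As an illustrative sketch (not the authors' implementation), the information-distance machinery behind recursive grouping can be outlined as follows. For Gaussian variables the information distance is d(i, j) = -log|ρ_ij| (ρ = pairwise correlation), and because these distances are additive along the latent tree, two observed leaves i and j share a parent exactly when d(i, k) - d(j, k) is constant over every other node k. The function names below are hypothetical:

```python
import numpy as np

def information_distances(samples):
    """samples: (n_samples, n_vars) array -> pairwise information-distance
    matrix d(i, j) = -log |corr(x_i, x_j)| for Gaussian variables."""
    corr = np.corrcoef(samples, rowvar=False)
    return -np.log(np.abs(corr))

def are_siblings(d, i, j, eps=1e-6):
    """Sibling test on an (exact) additive tree metric d: leaves i and j
    share a parent iff Phi(i, j; k) = d(i, k) - d(j, k) is constant in k."""
    others = [k for k in range(d.shape[0]) if k not in (i, j)]
    phi = [d[i, k] - d[j, k] for k in others]
    return max(phi) - min(phi) < eps
```

For example, on the exact metric of a tree where leaves 0 and 1 hang off one hidden node and leaves 2 and 3 off another, `are_siblings(d, 0, 1)` holds while `are_siblings(d, 0, 2)` does not; with finite samples the equality test must of course be replaced by a statistical threshold.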
Reconstruction for models on random graphs
 In FOCS ’07: Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science, 2007
Abstract

Cited by 30 (4 self)
The reconstruction problem requires estimating a random variable given ‘far away’ observations. Several theoretical results (and simple algorithms) are available when the underlying probability distribution is Markov with respect to a tree. In this paper we establish several exact thresholds for loopy graphs. More precisely, we consider models on random graphs that converge locally to trees. We establish the reconstruction thresholds for the Ising model both with attractive and random interactions (respectively, ‘ferromagnetic’ and ‘spin glass’). Remarkably, in the first case the result does not coincide with the corresponding tree threshold. Among the other tools, we develop a sufficient condition for the tree and graph reconstruction problems to coincide. We apply this condition to antiferromagnetic colorings of random graphs.
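For context on the "tree threshold" mentioned above: for the ferromagnetic Ising model on an infinite d-ary tree, the classical Kesten–Stigum condition d·tanh(β)² > 1 characterizes when reconstruction is solvable. A one-line check of that condition (the helper name is ours, not from the paper):

```python
import math

def ks_reconstruction_possible(branching, beta):
    """Kesten-Stigum condition for the Ising model on a d-ary tree:
    reconstruction is solvable iff branching * tanh(beta)**2 > 1."""
    return branching * math.tanh(beta) ** 2 > 1.0
```

The paper's point is that on locally tree-like random graphs this tree threshold is not always the graph threshold in the ferromagnetic case.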
Gibbs Measures and Phase Transitions on Sparse Random Graphs
Abstract

Cited by 27 (3 self)
Many problems of interest in computer science and information theory can be phrased in terms of a probability distribution over discrete variables associated to the vertices of a large (but finite) sparse graph. In recent years, considerable progress has been achieved by viewing these distributions as Gibbs measures and applying to their study heuristic tools from statistical physics. We review this approach and provide some results towards a rigorous treatment of these problems.
Topology Discovery of Sparse Random Graphs With Few Participants
Abstract

Cited by 8 (2 self)
We consider the task of topology discovery of sparse random graphs using end-to-end random measurements (e.g., delay) between a subset of nodes, referred to as the participants. The rest of the nodes are hidden, and do not provide any information for topology discovery. We consider topology discovery under two routing models: (a) the participants exchange messages along the shortest paths and obtain end-to-end measurements, and (b) additionally, the participants exchange messages along the second shortest path. For scenario (a), our proposed algorithm results in a sublinear edit-distance guarantee using a sublinear number of uniformly selected participants. For scenario (b), we obtain a much stronger result, and show that we can achieve consistent reconstruction when a sublinear number of uniformly selected nodes participate. This implies that accurate discovery of sparse random graphs is tractable using an extremely small number of participants. We finally obtain a lower bound on the number of participants required by any algorithm to reconstruct the original random graph up to a given edit distance. We also demonstrate that while consistent discovery is tractable for sparse random graphs using a small number of participants, in general, there are graphs which cannot be discovered by any algorithm even with a significant number of participants, and with the availability of end-to-end information along all the paths between the participants.
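The edit-distance guarantee above can be made concrete with a toy metric: counting the edge insertions and deletions separating a reconstructed graph from the true one (a minimal sketch; the paper's exact notion of edit distance may differ in its details):

```python
def edge_edit_distance(edges_a, edges_b):
    """Edit distance between two graphs on the same vertex set:
    the size of the symmetric difference of their undirected edge sets."""
    a = {frozenset(e) for e in edges_a}
    b = {frozenset(e) for e in edges_b}
    return len(a ^ b)
```

For instance, a reconstruction that misses one true edge and adds one spurious edge sits at edit distance 2; a "sublinear edit-distance guarantee" means this count grows slower than the graph size.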
Topology Discovery of Sparse Random Graphs With Few Participants
, 2011
Abstract

Cited by 3 (0 self)
We consider the task of topology discovery of sparse random graphs using end-to-end random measurements (e.g., delay) between a subset of nodes, referred to as the participants. The rest of the nodes are hidden, and do not provide any information for topology discovery. We consider topology discovery under two routing models: (a) the participants exchange messages along the shortest paths and obtain end-to-end measurements, and (b) additionally, the participants exchange messages along the second shortest path. For scenario (a), our proposed algorithm results in a sublinear edit-distance guarantee using a sublinear number of uniformly selected participants. For scenario (b), we obtain a much stronger result, and show that we can achieve consistent reconstruction when a sublinear number of uniformly selected nodes participate. This implies that accurate discovery of sparse random graphs is tractable using an extremely small number of participants. We finally obtain a lower bound on the number of participants required by any algorithm to reconstruct the original random graph up to a given edit distance. We also demonstrate that while consistent discovery is tractable for sparse random graphs using a small number of participants, in general, there are graphs which cannot be discovered by any algorithm even with a significant number of participants, and with the availability of end-to-end information along all the paths between the participants.
Laboratory for Information and Decision Systems,
Abstract
We study the problem of learning a latent tree graphical model where samples are available only from a subset of variables. We propose two consistent and computationally efficient algorithms for learning minimal latent trees, that is, trees without any redundant hidden nodes. Unlike many existing methods, the observed nodes (or variables) are not constrained to be leaf nodes. Our algorithms can be applied to both discrete and Gaussian random variables and our learned models are such that all the observed and latent variables have the same domain (state space). Our first algorithm, recursive grouping, builds the latent tree recursively by identifying sibling groups using so-called information distances. One of the main contributions of this work is our second algorithm, which we refer to as CLGrouping. CLGrouping starts with a preprocessing procedure in which a tree over the observed variables is constructed. This global step groups the observed nodes that are likely to be close to each other in the true latent tree, thereby guiding subsequent recursive grouping (or equivalent procedures such as neighbor-joining) on much smaller subsets of variables. This results in more accurate and efficient learning of latent trees. We
Consistent and Efficient Reconstruction of Latent Tree Models
Abstract
We study the problem of learning a latent tree graphical model where samples are available only from a subset of variables. We propose two consistent and computationally efficient algorithms for learning minimal latent trees, that is, trees without any redundant hidden nodes. Our first algorithm, recursive grouping, builds the latent tree recursively by identifying sibling groups. Our second and main algorithm, CLGrouping, starts with a preprocessing procedure in which a tree over the observed variables is constructed. This global step guides subsequent recursive grouping (or other latent-tree learning procedures) on much smaller subsets of variables. This results in more accurate and efficient learning of latent trees. We compare the proposed algorithms to other methods by performing extensive numerical experiments on various latent tree graphical models such as hidden Markov models and star graphs.
Learning Latent Tree Graphical Models
, 2014
Abstract
Terms of Use Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. Detailed Terms The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters.
List of Tables
, 2011
Abstract
The undersigned hereby certify that they have read and recommend to the Faculty of Graduate Studies for acceptance a thesis entitled “BLIND NETWORK TOMOGRAPHY” by Muhammad Hassan Raza in partial fulfillment of the requirements for the degree of Doctor of Philosophy.