Structure learning of antiferromagnetic Ising models
In NIPS, 2014
"... In this paper we investigate the computational complexity of learning the graph structure underlying a discrete undirected graphical model from i.i.d. samples. Our first result is an unconditional computational lower bound of (pd/2) for learning general graphical models on p nodes of maximum degree ..."
Abstract

In this paper we investigate the computational complexity of learning the graph structure underlying a discrete undirected graphical model from i.i.d. samples. Our first result is an unconditional computational lower bound of Ω(p^{d/2}) for learning general graphical models on p nodes of maximum degree d, for the class of so-called statistical algorithms recently introduced by Feldman et al. [1]. The construction is related to the notoriously difficult learning-parities-with-noise problem in computational learning theory. Our lower bound suggests that the Õ(p^{d+2}) runtime required by Bresler, Mossel, and Sly's [2] exhaustive-search algorithm cannot be significantly improved without restricting the class of models. Aside from structural assumptions on the graph such as it being a tree, hypertree, tree-like, etc., many recent papers on structure learning assume that the model has the correlation decay property. Indeed, focusing on ferromagnetic Ising models, Bento and Montanari [3] showed that all known low-complexity algorithms fail to learn simple graphs when the interaction strength exceeds a number related to the correlation decay threshold. Our second set of results gives a class of repelling (antiferromagnetic) models that have the opposite behavior: very strong interaction allows efficient learning in time Õ(p^2). We provide an algorithm whose performance interpolates between Õ(p^2) and Õ(p^{d+2}) depending on the strength of the repulsion.
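To make the setting concrete, here is a minimal sketch (not the paper's algorithm) of the kind of structure-learning task the abstract describes: spins are Gibbs-sampled from an antiferromagnetic Ising model on a small graph, and edges are guessed by thresholding empirical pairwise correlations, which are strongly negative across repelling edges. The cycle graph, the coupling strength beta = 0.4, and the threshold -0.25 are all arbitrary choices for this demo.

```python
import math
import random

# Illustrative sketch only: sample an antiferromagnetic Ising model on a
# 6-cycle with single-site Gibbs updates, then declare as edges the pairs
# whose empirical correlation falls below a (demo) negative threshold.

random.seed(0)
p = 6                                              # number of spins
true_edges = {(i, (i + 1) % p) for i in range(p)}  # 6-cycle graph

def neighbors(i):
    return [j for j in range(p)
            if (i, j) in true_edges or (j, i) in true_edges]

beta = 0.4  # repulsion strength: coupling J_ij = -beta on every edge

def gibbs_samples(n, burn=500, thin=6):
    """Single-site Gibbs sampler; 'thin' keeps one state per full sweep."""
    s = [random.choice([-1, 1]) for _ in range(p)]
    out = []
    for t in range(burn + n * thin):
        i = random.randrange(p)
        h = -beta * sum(s[j] for j in neighbors(i))  # local field at spin i
        # Conditional law: P(s_i = +1 | rest) = 1 / (1 + exp(-2h))
        s[i] = 1 if random.random() < 1.0 / (1.0 + math.exp(-2.0 * h)) else -1
        if t >= burn and (t - burn) % thin == 0:
            out.append(tuple(s))
    return out

samples = gibbs_samples(4000)

def corr(a, b):
    return sum(x[a] * x[b] for x in samples) / len(samples)

# Pairs with strongly negative correlation are declared edges.
found = {(i, j) for i in range(p) for j in range(i + 1, p)
         if corr(i, j) < -0.25}
print(sorted(found))
```

This correlation-thresholding heuristic happens to work on a short cycle with uniform couplings at moderate beta; the abstract's point is about what is provably achievable in general, where the repulsion strength determines how close the runtime can get to Õ(p^2).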