Results 1–10 of 415
Operations for Learning with Graphical Models. Journal of Artificial Intelligence Research, 1994. Cited by 276 (13 self).
Abstract: "... This paper is a multidisciplinary review of empirical, statistical learning from a graphical model perspective. Well-known examples of graphical models include Bayesian networks, directed graphs representing a Markov chain, and undirected networks representing a Markov field. These graphical models are extended to model data analysis and empirical learning using the notation of plates. Graphical operations for simplifying and manipulating a problem are provided, including decomposition, differentiation, and the manipulation of probability models from the exponential family. Two standard algorithm schemas ..."
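A Markov field of the kind this review covers factors into a product of potentials on the graph's cliques, normalized by a partition function. A minimal sketch on a three-variable binary chain, with illustrative potential tables (the numbers are made up for the example):

```python
from itertools import product

# Pairwise Markov field on three binary variables with edges (0,1) and (1,2).
# Each edge carries a potential table psi[(xi, xj)]; the joint distribution is
# the product of edge potentials divided by the partition function Z.
edges = [(0, 1), (1, 2)]
psi = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}  # same table on both edges

def unnormalized(x):
    score = 1.0
    for i, j in edges:
        score *= psi[(x[i], x[j])]
    return score

# Z is the sum of unnormalized scores over all 2^3 configurations --
# tractable here only because the model is tiny.
Z = sum(unnormalized(x) for x in product([0, 1], repeat=3))

def prob(x):
    return unnormalized(x) / Z

total = sum(prob(x) for x in product([0, 1], repeat=3))
```

The potentials favor agreeing neighbors, so configurations like (0, 0, 0) receive more mass than (0, 1, 0).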
Max-margin Markov networks. 2003. Cited by 604 (15 self).
Abstract: "... In typical classification tasks, we seek a function which assigns a label to a single object. Kernel-based approaches, such as support vector machines (SVMs), which maximize the margin of confidence of the classifier, are the method of choice for many such tasks. Their popularity stems both from the ... independently to each object, losing much useful information. Conversely, probabilistic graphical models, such as Markov networks, can represent correlations between labels by exploiting problem structure, but cannot handle high-dimensional feature spaces and lack strong theoretical generalization guarantees ..."
Variational algorithms for approximate Bayesian inference. 2003. Cited by 440 (9 self).
Abstract: "... The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent way, avoids overfitting problems, and provides a principled basis for selecting between alternative models. Unfortunately the computations required are usually intractable. This thesis presents a unified variational Bayesian (VB) framework which approximates these computations in models with latent variables using a lower bound on the marginal likelihood. Chapter 1 presents background material on Bayesian inference, graphical models, and propagation algorithms. Chapter 2 forms ..."
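The lower bound on the marginal likelihood that this abstract refers to is, in its standard form, obtained by Jensen's inequality with an arbitrary distribution q over the unobserved quantities θ:

```latex
\log p(\mathcal{D})
  = \log \int p(\mathcal{D}, \theta)\, d\theta
  \;\ge\; \int q(\theta) \log \frac{p(\mathcal{D}, \theta)}{q(\theta)}\, d\theta
  \;=\; \mathcal{F}(q).
```

The gap between the two sides is exactly the KL divergence from q(θ) to the true posterior p(θ | D), so maximizing F(q) over a tractable family of q both tightens the bound and improves the posterior approximation.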
Active Learning for Undirected Graphical Model Selection. 2014.
Abstract: "... This paper studies graphical model selection, i.e., the problem of estimating a graph of statistical relationships among a collection of random variables. Conventional graphical model selection algorithms are passive, i.e., they require all the measurements to have been collected before processing ..."
Replicated softmax: an undirected topic model. In Advances in Neural Information Processing Systems. Cited by 67 (14 self).
Abstract: "... We introduce a two-layer undirected graphical model, called a “Replicated Softmax”, that can be used to model and automatically extract low-dimensional latent semantic representations from a large unstructured collection of documents. We present efficient learning and inference algorithms for this model ..."
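A sketch of the energy such a two-layer model assigns to a (word-count vector, hidden vector) pair, assuming the standard RBM-with-shared-softmax-visibles form in which the hidden biases are scaled by the document length D (the consequence of "replicating" the softmax unit across word positions). All variable names and numbers here are illustrative, not taken from the paper:

```python
def replicated_softmax_energy(v, h, W, b, a):
    """Energy of a replicated-softmax-style RBM over word counts.

    v : word-count vector over the vocabulary (length K)
    h : binary hidden vector (length F)
    W : F x K weight matrix (list of lists)
    b : visible biases (length K)
    a : hidden biases (length F), scaled below by document length D = sum(v)
    """
    D = sum(v)
    interaction = sum(W[j][k] * h[j] * v[k]
                      for j in range(len(h)) for k in range(len(v)))
    visible_bias = sum(b[k] * v[k] for k in range(len(v)))
    hidden_bias = D * sum(a[j] * h[j] for j in range(len(h)))
    return -interaction - visible_bias - hidden_bias
```

Lower energy means higher unnormalized probability; learning adjusts W, b, a so that observed documents receive low energy.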
EDML for Learning Parameters in Directed and Undirected Graphical Models.
Abstract: "... EDML is a recently proposed algorithm for learning parameters in Bayesian networks. It was originally derived in terms of approximate inference on a meta-network which underlies the Bayesian approach to parameter estimation. While this initial derivation helped discover EDML in the first place and p ... The new perspective has several advantages. First, it makes immediate some results that were non-trivial to prove initially. Second, it facilitates the design of EDML algorithms for new graphical models, leading to a new algorithm for learning parameters in Markov networks. We derive this algorithm ..."
Bayesian Learning in Undirected Graphical Models: Approximate MCMC algorithms. 2004. Cited by 50 (2 self).
Abstract: "... Bayesian learning in undirected graphical models (computing posterior distributions over parameters and predictive quantities) is exceptionally difficult. We conjecture that for general undirected models, there are no tractable MCMC (Markov Chain Monte Carlo) schemes giving the correct equilibrium ..."
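One way to see the difficulty the abstract points to: the likelihood of an undirected model contains a partition function Z(θ) that itself depends on the parameters, so even the acceptance ratio of a standard Metropolis step over θ requires an intractable quantity ("doubly intractable" inference). A toy sketch where the graph is small enough that Z(θ) can be enumerated exactly, which is precisely what fails at scale (the data and coupling values are made up for illustration):

```python
import math
from itertools import product

# Two-spin model: p(x | theta) = exp(theta * x1 * x2) / Z(theta), x in {-1,+1}^2.
# Z(theta) depends on theta, so the log-likelihood of theta needs it per value.

def logZ(theta):
    # Exact by enumeration -- only possible because there are 4 configurations.
    return math.log(sum(math.exp(theta * x1 * x2)
                        for x1, x2 in product([-1, 1], repeat=2)))

def log_likelihood(theta, data):
    return sum(theta * x1 * x2 for x1, x2 in data) - len(data) * logZ(theta)

# Mostly agreeing observations, so positive coupling should be preferred.
data = [(1, 1), (1, 1), (-1, -1), (1, -1)]
```

For a general undirected model, `logZ` would require summing over exponentially many configurations at every candidate θ, which is what motivates the approximate MCMC schemes the paper studies.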
A Junction Tree Framework for Undirected Graphical Model Selection. 2013. Cited by 1 (1 self).
Abstract: "... An undirected graphical model is a joint probability distribution defined on an undirected graph G∗, where the vertices in the graph index a collection of random variables and the edges encode conditional independence relationships amongst random variables. The undirected graphical model selection ..."
The Bayes Net Toolbox for MATLAB. Computing Science and Statistics, 2001. Cited by 250 (1 self).
Abstract: "... The Bayes Net Toolbox (BNT) is an open-source Matlab package for directed graphical models. BNT supports many kinds of nodes (probability distributions), exact and approximate inference, parameter and structure learning, and static and dynamic models. BNT is widely used in teaching and research: the ..."
Protein Design by Sampling an Undirected Graphical Model of Residue Constraints. Cited by 13 (4 self).
Abstract: "... Protein engineering seeks to produce amino acid sequences with desired characteristics, such as specified structure [1] or function [4]. This is a difficult problem due to interactions among residues; choosing an amino acid type at one position may constrain the possibilities at others, in order for ... evolving residues. Recently, Ranganathan and colleagues showed that accounting for residue coupling, in addition to conservation, was to some extent both necessary and sufficient for viability of new WW domains [6, 5]. We have previously developed an approach for learning an undirected graphical model encapsulating ..."
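Sampling sequences from such a model is typically done position by position, e.g. with Gibbs sampling: each site is resampled from its conditional distribution given its neighbors in the graph. A generic sketch for a pairwise model with node potentials for conservation and edge potentials for coupling; the alphabet, potential tables, and edge set below are illustrative, not the model learned in the paper:

```python
import random

def gibbs_sweep(x, alphabet, phi, psi, edges, rng):
    """One Gibbs sweep over a sequence x under a pairwise undirected model.

    phi[i][s]         : node potential for symbol s at position i
    psi[(i, j)][(s,t)]: edge potential for symbols (s, t) on edge (i, j)
    """
    for i in range(len(x)):
        weights = []
        for s in alphabet:
            w = phi[i][s]
            for (a, b) in edges:
                if a == i:
                    w *= psi[(a, b)][(s, x[b])]
                elif b == i:
                    w *= psi[(a, b)][(x[a], s)]
            weights.append(w)
        # Sample a symbol in proportion to its conditional weight.
        r = rng.random() * sum(weights)
        acc = 0.0
        for s, w in zip(alphabet, weights):
            acc += w
            if r <= acc:
                x[i] = s
                break
    return x

# Two-position example: uniform conservation, one edge strongly
# favoring matching symbols -- so sampled pairs should mostly agree.
rng = random.Random(0)
alphabet = ['A', 'G']
phi = [{'A': 1.0, 'G': 1.0}, {'A': 1.0, 'G': 1.0}]
psi = {(0, 1): {('A', 'A'): 10.0, ('G', 'G'): 10.0,
                ('A', 'G'): 1.0, ('G', 'A'): 1.0}}
edges = [(0, 1)]
x = ['A', 'G']
matched = 0
for _ in range(2000):
    x = gibbs_sweep(x, alphabet, phi, psi, edges, rng)
    if x[0] == x[1]:
        matched += 1
```

Under these tables the stationary distribution puts roughly 10/11 of its mass on matching pairs, and the sweep counts reflect that.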