Results 1-10 of 22,279
Boltzmann machines, 2007
Abstract: A Boltzmann Machine is a network of symmetrically connected, neuron-like units that make stochastic decisions about whether to be on or off. Boltzmann machines have a simple learning algorithm that allows them to discover interesting features in datasets composed of binary vectors. The learning algor ...
Cited by 220 (21 self)
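The stochastic on/off decision this abstract describes is the logistic of a unit's energy gap. As an illustrative sketch (my own, not code from the listed paper), assuming symmetric weights stored as a matrix:

```python
import math
import random

def unit_on_probability(states, weights, i, temperature=1.0):
    """Probability that unit i turns on: the logistic function of its
    energy gap, i.e. the summed input from all other (symmetric) connections."""
    energy_gap = sum(weights[i][j] * states[j]
                     for j in range(len(states)) if j != i)
    return 1.0 / (1.0 + math.exp(-energy_gap / temperature))

def stochastic_update(states, weights, i, rng, temperature=1.0):
    """Set unit i to 1 with the probability above, else 0 (one Gibbs step)."""
    p = unit_on_probability(states, weights, i, temperature)
    states[i] = 1 if rng.random() < p else 0
    return states
```

With zero weights a unit is on or off with equal probability; a strong positive connection to an active neighbor pushes the probability toward 1.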
A learning algorithm for Boltzmann machines, Cognitive Science, 1985
Abstract: The computational power of massively parallel networks of simple processing elements resides in the communication bandwidth provided by the hardware connections between elements. These connections can allow a significant fraction of the knowledge of the system to be applied to an instance of a problem in a very short time. One kind of computation for which massively parallel networks appear to be well suited is large constraint satisfaction searches, but to use the connections efficiently two conditions must be met: First, a search technique that is suitable for parallel networks must be found. Second, there must be some way of choosing internal representations which allow the preexisting hardware connections to be used efficiently for encoding the constraints in the domain being searched. We describe a general parallel search method, based on statistical mechanics, and we show how it leads to a general learning rule for modifying the connection strengths so as to incorporate knowledge about a task domain in an efficient way. We describe some simple examples in which the learning algorithm creates internal representations that are demonstrably the most efficient way of using the preexisting connectivity structure.
Cited by 586 (13 self)
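The learning rule this abstract refers to adjusts each weight by the difference between data and model pairwise statistics, Δw_ij ∝ ⟨s_i s_j⟩_data − ⟨s_i s_j⟩_model. A minimal sketch (illustrative; the sampling that produces `model_samples` is assumed to happen elsewhere):

```python
def pairwise_correlations(samples):
    """Average of s_i * s_j over a list of binary state vectors."""
    n = len(samples[0])
    corr = [[0.0] * n for _ in range(n)]
    for s in samples:
        for i in range(n):
            for j in range(n):
                corr[i][j] += s[i] * s[j]
    m = len(samples)
    return [[c / m for c in row] for row in corr]

def boltzmann_weight_update(weights, data_samples, model_samples, lr=0.1):
    """Delta w_ij = lr * (<s_i s_j>_data - <s_i s_j>_model),
    applied to all off-diagonal (symmetric) weights."""
    cd = pairwise_correlations(data_samples)
    cm = pairwise_correlations(model_samples)
    n = len(weights)
    for i in range(n):
        for j in range(n):
            if i != j:
                weights[i][j] += lr * (cd[i][j] - cm[i][j])
    return weights
```

Units that co-occur in the data more often than under the model get their connection strengthened, and vice versa.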
Boltzmann Machines, 2011
Abstract: In practice, training Restricted Boltzmann Machines with Contrastive Divergence and other approximate maximum likelihood methods works well on data with black backgrounds. However, when using inverted images for training, learning is typically much worse. In this paper, we propose a very simple yet ...
Higher-Order Boltzmann Machines, Neural Networks for Computing, 1986
Abstract: The Boltzmann machine is a nonlinear network of stochastic binary processing units that interact pairwise through symmetric connection strengths. In a third-order Boltzmann machine, triples of units interact through symmetric conjunctive interactions. The Boltzmann learning algorithm is generalized ...
Cited by 25 (0 self)
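The conjunctive triple interactions mentioned here contribute an energy term over products of three unit states. As an illustrative sketch of that energy contribution (one common sign convention, not necessarily the paper's), with triple weights stored sparsely:

```python
from itertools import combinations

def third_order_energy(states, triple_weights):
    """Energy contribution from conjunctive triples:
    E3 = -sum over i<j<k of w_ijk * s_i * s_j * s_k.
    triple_weights maps sorted index triples (i, j, k) to weights;
    absent triples contribute nothing."""
    n = len(states)
    return -sum(triple_weights.get((i, j, k), 0.0)
                * states[i] * states[j] * states[k]
                for i, j, k in combinations(range(n), 3))
```

A triple contributes only when all three of its units are on, which is what makes the interaction conjunctive.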
Restricted Boltzmann machines for collaborative filtering, In Machine Learning, Proceedings of the Twenty-fourth International Conference (ICML 2007), ACM, 2007
Abstract: Most of the existing approaches to collaborative filtering cannot handle very large data sets. In this paper we show how a class of two-layer undirected graphical models, called Restricted Boltzmann Machines (RBM’s), can be used to model tabular data, such as user’s ratings of movies. We present eff ...
Cited by 213 (13 self)
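What makes the two-layer restriction useful is that units within a layer are unconnected, so each layer's units are conditionally independent given the other layer. A minimal sketch of those conditionals for binary units (illustrative, not the paper's rating-specific model):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def hidden_activation_probs(visible, W, hidden_bias):
    """p(h_j = 1 | v) for a binary RBM: with no hidden-hidden connections,
    each hidden unit depends only on the visible layer."""
    return [sigmoid(hidden_bias[j]
                    + sum(W[i][j] * visible[i] for i in range(len(visible))))
            for j in range(len(hidden_bias))]

def visible_activation_probs(hidden, W, visible_bias):
    """p(v_i = 1 | h), using the same symmetric weights W."""
    return [sigmoid(visible_bias[i]
                    + sum(W[i][j] * hidden[j] for j in range(len(hidden))))
            for i in range(len(visible_bias))]
```

Alternating these two conditionals is one full step of block Gibbs sampling, the building block of Contrastive Divergence training.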
Learning and evaluating Boltzmann machines, 2008
Abstract: We provide a brief overview of the variational framework for obtaining deterministic approximations or upper bounds for the log-partition function. We also review some of the Monte Carlo based methods for estimating partition functions of arbitrary Markov Random Fields. We then develop an annealed importance sampling (AIS) procedure for estimating partition functions of restricted Boltzmann machines (RBM’s), semi-restricted Boltzmann machines (SRBM’s), and Boltzmann machines (BM’s). Our empirical results indicate that the AIS procedure provides much better estimates of the partition function than some ...
Cited by 13 (2 self)
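The quantity AIS estimates is the partition function Z, which is only tractable by brute force for tiny models. As an illustrative sketch of what is being approximated (exact enumeration for a small binary RBM; not the AIS procedure itself):

```python
import math
from itertools import product

def rbm_partition_function(W, visible_bias, hidden_bias):
    """Exact Z = sum over all (v, h) of exp(-E(v, h)), with
    E(v, h) = -(sum_ij v_i W_ij h_j + sum_i b_i v_i + sum_j c_j h_j).
    Feasible only for tiny models; AIS estimates Z when enumeration is not."""
    nv, nh = len(visible_bias), len(hidden_bias)
    Z = 0.0
    for v in product([0, 1], repeat=nv):
        for h in product([0, 1], repeat=nh):
            energy = -(sum(v[i] * W[i][j] * h[j]
                           for i in range(nv) for j in range(nh))
                       + sum(visible_bias[i] * v[i] for i in range(nv))
                       + sum(hidden_bias[j] * h[j] for j in range(nh)))
            Z += math.exp(-energy)
    return Z
```

With all parameters zero, every one of the 2^(nv+nh) joint configurations contributes exp(0) = 1, so Z is just the number of configurations.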
Boltzmann Machines and the EM algorithm
Abstract: In this paper we formulate the Expectation Maximization (EM) algorithm for Boltzmann Machines and we prove that the Kullback distance is a Lyapunov function for the EM algorithm. As a result the EM algorithm yields the same solutions as the original learning rule of Ackley, Hinton and Sejnowski. We ...
Cited by 1 (0 self)
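The Kullback distance invoked here is the KL divergence between two distributions, the quantity the paper argues decreases monotonically under EM. As a small illustrative helper (for distributions given as probability vectors; not code from the paper):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler distance D(p || q) = sum_x p(x) * log(p(x) / q(x)).
    Nonnegative, and zero exactly when p and q agree; terms with p(x) = 0
    contribute nothing by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

Being nonnegative and zero only at agreement is what qualifies it as a Lyapunov function for an iteration that never increases it.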
Large Margin Boltzmann Machines
Abstract: Boltzmann Machines are a powerful class of undirected graphical models. Originally proposed as artificial neural networks, they can be regarded as a type of Markov Random Field in which the connection weights between nodes are symmetric and learned from data. They are also closely related to recent ...
Cited by 2 (0 self)