Results 1–10 of 96

Multi-column deep neural networks for image classification
In Proceedings of the 25th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2012), 2012
Cited by 151 (9 self)
Abstract: "... Traditional methods of computer vision and machine learning cannot match human performance on tasks such as the recognition of handwritten digits or traffic signs. Our biologically plausible deep artificial neural network architectures can. Small (often minimal) receptive fields of convolutional win ..."

Large-scale deep unsupervised learning using graphics processors
International Conf. on Machine Learning, 2009
Cited by 51 (8 self)
Abstract: "... The promise of unsupervised learning methods lies in their potential to use vast amounts of unlabeled data to learn complex, highly nonlinear models with millions of free parameters. We consider two well-known unsupervised learning models, deep belief networks (DBNs) and sparse coding, that have rec ... examples. In this paper, we suggest massively parallel methods to help resolve these problems. We argue that modern graphics processors far surpass the computational capabilities of multicore CPUs, and have the potential to revolutionize the applicability of deep unsupervised learning methods. We develop ..."

Learning the Structure of Deep Sparse Graphical Models
Abstract: "... Deep belief networks are a powerful way to model complex probability distributions. However, it is difficult to learn the structure of a belief network, particularly one with hidden units. The Indian buffet process has been used as a nonparametric Bayesian prior on the structure of a directed belief ..."

On the Quantitative Analysis of Deep Belief Networks
Cited by 84 (17 self)
Abstract: "... Deep Belief Networks (DBN's) are generative models that contain many layers of hidden variables. Efficient greedy algorithms for learning and approximate inference have allowed these models to be applied successfully in many application domains. The main building block of a DBN is a bipartite undirected graphical model called a restricted Boltzmann machine (RBM). Due to the presence of the partition function, model selection, complexity control, and exact maximum likelihood learning in RBM's are intractable. We show that Annealed Importance Sampling (AIS) can be used to efficiently estimate ..."

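The abstract above names the core trick: use AIS to estimate the RBM partition function that makes exact likelihood intractable. The sketch below is a toy illustration of that idea, not the paper's implementation; the model sizes, weight scales, and annealing schedule are all made up, and the model is kept small enough that the exact log Z can be found by brute-force enumeration for comparison.

```python
import numpy as np

# Toy AIS estimate of an RBM partition function, checked against enumeration.
rng = np.random.default_rng(0)
nv, nh = 3, 2                                  # hypothetical tiny model
W = rng.normal(scale=0.5, size=(nv, nh))
b = rng.normal(scale=0.5, size=nv)             # visible biases
c = rng.normal(scale=0.5, size=nh)             # hidden biases

def log_pstar(v, beta):
    """Log unnormalized marginal of v with hidden units summed out,
    at inverse temperature beta (beta=0 gives a uniform base model)."""
    return beta * (v @ b) + np.logaddexp(0.0, beta * (c + v @ W)).sum()

# Exact log Z by enumerating all 2^nv visible states (feasible only for toys).
states = [np.array([(i >> k) & 1 for k in range(nv)], float)
          for i in range(2 ** nv)]
logZ_exact = np.logaddexp.reduce([log_pstar(s, 1.0) for s in states])

def ais_logZ(n_runs=100, n_betas=500):
    betas = np.linspace(0.0, 1.0, n_betas)
    logw = np.zeros(n_runs)
    for r in range(n_runs):
        v = (rng.random(nv) < 0.5).astype(float)   # exact sample at beta = 0
        for beta_prev, beta in zip(betas[:-1], betas[1:]):
            # Accumulate the importance weight, then move with one Gibbs
            # sweep targeting the intermediate distribution at beta.
            logw[r] += log_pstar(v, beta) - log_pstar(v, beta_prev)
            h = (rng.random(nh) < 1.0 / (1.0 + np.exp(-beta * (c + v @ W)))).astype(float)
            v = (rng.random(nv) < 1.0 / (1.0 + np.exp(-beta * (b + W @ h)))).astype(float)
    logZ0 = (nv + nh) * np.log(2.0)            # base model: all parameters zero
    return logZ0 + np.logaddexp.reduce(logw) - np.log(n_runs)

logZ_ais = ais_logZ()
```

With this many intermediate distributions and such small weights the AIS estimate lands very close to the enumerated log Z; on realistic RBMs enumeration is impossible and the AIS estimate is all one has.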
Sum-Product Networks: A New Deep Architecture
Cited by 73 (10 self)
Abstract: "... The key limiting factor in graphical model inference and learning is the complexity of the partition function. We thus ask the question: what are general conditions under which the partition function is tractable? The answer leads to a new kind of deep architecture, which we call sum-product networks ..."

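The tractability claim in this abstract can be made concrete with a minimal sketch. The structure and weights below are invented for illustration; the point is that in a sum-product network the partition function falls out of a single bottom-up pass with every leaf indicator set to 1 (all variables marginalized), instead of a sum over exponentially many states.

```python
# Minimal sum-product network (SPN) sketch over two binary variables.

def leaf(x, value):
    """Indicator leaf: 1.0 if x equals `value`, or if x is marginalized (None)."""
    return 1.0 if x is None or x == value else 0.0

def spn(x1, x2):
    # A root sum node with made-up weights 0.25/0.75 over two product nodes.
    p1 = leaf(x1, 0) * leaf(x2, 0)
    p2 = leaf(x1, 1) * leaf(x2, 1)
    return 0.25 * p1 + 0.75 * p2

value = spn(0, 0)        # unnormalized score of one complete state
Z = spn(None, None)      # partition function: one pass, all indicators = 1
```

Because the sum node's weights already add to 1 here, the network comes out normalized (Z = 1); in general one divides the root value by Z, which costs only one extra evaluation.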
Learning Deep Inference Machines
Abstract: "... Introduction. The traditional approach to structured prediction problems is to craft a graphical model structure, learn parameters for the model, and perform inference using an efficient – and usually approximate – inference approach, including, e.g., graph cut methods, belief propagation, and variat ..."

Deep Learning on GPUs with Theano
In The Learning Workshop, 2010
Cited by 1 (0 self)
Abstract: "... Since the introduction of Deep Belief Networks (Hinton et al., 2006), a surge of deep learning approaches have achieved state-of-the-art performance in natural language tasks, audio classification, and image classification and demonstrated the advantage of using graphics hardware (GPUs) for computat ..."

Sum-Product Networks for Deep Learning
Abstract: "... The key limiting factor in graphical model inference and learning is the complexity of the partition function. We thus ask the question: what are the most general conditions under which the partition function is tractable? The answer leads to a new kind of deep architecture, which we call sum-product ..."

Learning and Evaluating Deep Boltzmann Machines
Abstract: "... Building intelligent systems that are capable of extracting high-level representations from high-dimensional sensory data lies at the core of solving many AI related tasks, including object recognition, speech perception, and language understanding. Theoretical and biological arguments strongly sugg ... account for uncertainty when interpreting ambiguous sensory inputs. In this work, we present a new learning algorithm for a different type of hierarchical probabilistic model: a deep Boltzmann machine (DBM). Unlike deep belief networks, a DBM is a type of Markov random field, or undirected graphical ..."

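The distinction this abstract draws (a DBM is an undirected model, unlike a DBN) shows up directly in its energy function, which couples adjacent layers symmetrically. The toy sketch below illustrates that form only; the layer sizes and weights are made up, and biases are omitted for brevity.

```python
import numpy as np

# Toy energy function of a two-hidden-layer deep Boltzmann machine (DBM).
rng = np.random.default_rng(1)
W1 = rng.normal(scale=0.1, size=(4, 3))   # visible  <-> hidden layer 1
W2 = rng.normal(scale=0.1, size=(3, 2))   # hidden 1 <-> hidden layer 2

def energy(v, h1, h2):
    """E(v, h1, h2) = -v'W1 h1 - h1'W2 h2: symmetric layer-wise couplings."""
    return -(v @ W1 @ h1) - (h1 @ W2 @ h2)

v  = np.array([1., 0., 1., 1.])
h1 = np.array([1., 1., 0.])
h2 = np.array([0., 1.])

# Unnormalized probability of a joint state; normalizing it needs the
# intractable partition function, which is what makes DBM learning hard.
p_unnorm = np.exp(-energy(v, h1, h2))
```

Because the couplings are undirected, each hidden unit's conditional depends on the layers both above and below it, which is exactly why DBM inference and learning need the approximate procedures the paper develops.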