Results 1–10 of 913
Tighter PAC-Bayes Bounds
, 2006
"... This paper proposes a PAC-Bayes bound to measure the performance of Support Vector Machine (SVM) classifiers. The bound is based on learning a prior over the distribution of classifiers with a part of the training samples. Experimental work shows that this bound is tighter than the original PAC-Bayes ..."

Cited by 16 (2 self)
PAC-Bayes Bounds with Data Dependent Priors
"... This paper presents the prior PAC-Bayes bound and explores its capabilities as a tool to provide tight predictions of SVMs’ generalization. The computation of the bound involves estimating a prior of the distribution of classifiers from the available data, and then manipulating this prior in the usual PAC-Bayes generalization bound. We explore two alternatives: to learn the prior from a separate data set, or to consider an expectation prior that does not need this separate data set. The prior PAC-Bayes bound motivates two SVM-like classification algorithms, prior SVM and η-prior SVM, whose ..."

Cited by 3 (1 self)
Data Dependent Priors in PAC-Bayes Bounds
"... One of the central aims of Statistical Learning Theory is the bounding of the test set performance of classifiers trained with i.i.d. data. For Support Vector Machines the tightest technique for assessing this so-called generalisation error is known as the PAC-Bayes theorem. The bound hold ..."
From PAC-Bayes bounds to KL regularization
 Advances in Neural Information Processing Systems 22
, 2009
"... We show that convex KL-regularized objective functions are obtained from a PAC-Bayes risk bound when using convex loss functions for the stochastic Gibbs classifier that upper-bound the standard zero-one loss used for the weighted majority vote. By restricting ourselves to a class of posteriors, tha ..."

Cited by 8 (3 self)
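For orientation, the PAC-Bayes theorem that the entries above build on is commonly stated in the following Seeger/Maurer form; this is given as general background, not quoted from any of the listed papers. For a prior $P$ fixed before seeing the data, any $\delta \in (0,1]$, and an i.i.d. sample $S$ of size $m$, with probability at least $1-\delta$, simultaneously for all posteriors $Q$:

```latex
\operatorname{kl}\!\left(\hat{R}_S(G_Q)\,\middle\|\,R(G_Q)\right)
  \;\le\; \frac{\mathrm{KL}(Q\|P) + \ln\frac{2\sqrt{m}}{\delta}}{m},
\qquad
\operatorname{kl}(q\|p) = q\ln\frac{q}{p} + (1-q)\ln\frac{1-q}{1-p},
```

where $\hat{R}_S(G_Q)$ and $R(G_Q)$ denote the empirical and true risks of the Gibbs classifier drawn from $Q$. Minimizing the right-hand side over $Q$ is what yields KL-regularized objectives of the kind this entry describes.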
Finding structure in time
 COGNITIVE SCIENCE
, 1990
"... Time underlies many interesting human behaviors. Thus, the question of how to represent time in connectionist models is very important. One approach is to represent time implicitly by its effects on processing rather than explicitly (as in a spatial representation). The current report develops a pro ..."

Cited by 2071 (23 self)
"... of prior internal states. A set of simulations is reported which ranges from relatively simple problems (a temporal version of XOR) to discovering syntactic/semantic features for words. The networks are able to learn interesting internal representations which incorporate task demands with memory demands ..."
Relative absorptive capacity and interorganizational learning
 STRATEGIC MANAGEMENT JOURNAL
, 1998
"... Much of the prior research on interorganizational learning has focused on the role of absorptive capacity, a firm’s ability to value, assimilate, and utilize new external knowledge. However, this definition of the construct suggests that a firm has an equal capacity to learn from all other organizat ..."

Cited by 463 (2 self)
Variational algorithms for approximate Bayesian inference
, 2003
"... The Bayesian framework for machine learning allows for the incorporation of prior knowledge in a coherent way, avoids overfitting problems, and provides a principled basis for selecting between alternative models. Unfortunately the computations required are usually intractable. This thesis presents ..."

Cited by 440 (9 self)
A PAC-Bayes Sample Compression Approach to Kernel Methods
"... We propose a PAC-Bayes sample compression approach to kernel methods that can accommodate any bounded similarity function and show that the support vector machine (SVM) classifier is a particular case of a more general class of data-dependent classifiers known as majority votes of sample-compressed c ..."
Chromatic PAC-Bayes Bounds for non-IID Data
 In AISTATS 09: JMLR Workshop and Conference Proceedings
, 2009
"... PAC-Bayes bounds are among the tightest generalization bounds for classifiers learned from i.i.d. data, especially for margin classifiers. However, there are many practical cases where the training data show some dependencies and where the usual i.i.d. assumption does not hold. Stating generaliz ..."

Cited by 4 (2 self)
Dimensionality Dependent PAC-Bayes Margin Bound
"... Margin is one of the most important concepts in machine learning. Previous margin bounds, both for SVM and for boosting, are dimensionality independent. A major advantage of this dimensionality independency is that it can explain the excellent performance of SVM whose feature spaces are often of high or infinite dimension. In this paper we address the problem of whether such dimensionality independency is intrinsic for the margin bounds. We prove a dimensionality dependent PAC-Bayes margin bound. The bound is monotone increasing with respect to the dimension when keeping all other factors fixed ..."

Cited by 1 (0 self)