Results 1 – 10 of 49,549
PAC-Bayes risk bounds for sample-compressed Gibbs classifiers
Proceedings of the 22nd International Conference on Machine Learning (ICML 2005), 2005
"... We extend the PACBayes theorem to the samplecompression setting where each classifier is represented by two independent sources of information: a compression set which consists of a small subset of the training data, and a message string of the additional information needed to obtain a classifier. ..."
Cited by 5 (4 self)
The new bound is obtained by using a prior over a data-independent set of objects where each object gives a classifier only when the training data is provided. The new PAC-Bayes theorem states that a Gibbs classifier defined on a posterior over sample-compressed classifiers can have a smaller risk bound.
PAC-Bayes Risk Bounds for Stochastic Averages and Majority Votes of Sample-Compressed Classifiers
2007
"... We propose a PACBayes theorem for the samplecompression setting where each classifier is described by a compression subset of the training data and a message string of additional information. This setting, which is the appropriate one to describe many learning algorithms, strictly generalizes the ..."
Cited by 13 (1 self)
PAC-Bayes bounds for the risk of the majority vote and the variance of the Gibbs classifier
In Neural Information Processing Systems (NIPS), 2006
"... We propose new PACBayes bounds for the risk of the weighted majority vote that depend on the mean and variance of the error of its associated Gibbs classifier. We show that these bounds can be smaller than the risk of the Gibbs classifier and can be arbitrarily close to zero even if the risk of the ..."
Cited by 17 (3 self)
A PAC-Bayes Sample Compression Approach to Kernel Methods
"... We propose a PACBayes sample compression approach to kernel methods that can accommodate any bounded similarity function and show that the support vector machine (SVM) classifier is a particular case of a more general class of datadependent classifiers known as majority votes of samplecompressed c ..."
Cited by 1 (0 self)
From PAC-Bayes bounds to KL regularization
Advances in Neural Information Processing Systems 22, 2009
"... We show that convex KLregularized objective functions are obtained from a PACBayes risk bound when using convex loss functions for the stochastic Gibbs classifier that upperbound the standard zeroone loss used for the weighted majority vote. By restricting ourselves to a class of posteriors, tha ..."
Cited by 9 (3 self)
From PAC-Bayes Bounds to Quadratic Programs for Majority Votes
"... We propose to construct a weighted majority vote on a set of basis functions by minimizing a risk bound (called the Cbound) that depends on the first two moments of the margin of the Qconvex combination realized on the data. This bound minimization algorithm turns out to be a quadratic program tha ..."
Cited by 16 (6 self)
... that can be efficiently solved. A first version of the algorithm is designed for the supervised inductive setting and turns out to be very competitive with AdaBoost, MDBoost and the SVM. The second version is designed for the transductive setting. It competes well against TSVM. We also propose a new PAC-Bayes
PAC-Bayes Generalization Bounds for Randomized Structured Prediction
"... We present a new PACBayes generalization bound for structured prediction that is applicable to perturbationbased probabilistic models. Our analysis explores the relationship between perturbationbased modeling and the PACBayes framework, and connects to recently introduced generalization bounds f ..."
Cited by 2 (0 self)
A PAC-Bayes risk bound for general loss functions
Advances in Neural Information Processing Systems 19, 2007
"... We provide a PACBayesian bound for the expected loss of convex combinations of classifiers under a wide class of loss functions (which includes the exponential loss and the logistic loss). Our numerical experiments with Adaboost indicate that the proposed upper bound, computed on the training set, ..."
Cited by 5 (1 self)
... of research, known as the "PAC-Bayes theorem", provides a tight upper bound on the risk of a stochastic classifier (defined on the posterior Q) called the Gibbs classifier. In the context of binary classification, the Q-weighted majority vote classifier (related to this stochastic
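For orientation, the PAC-Bayes theorem that these results build on is usually stated as follows (this is the standard Seeger/Langford form of the bound, not a quotation from any of the papers above): for a prior P fixed before seeing the data and any posterior Q over classifiers, with probability at least 1 − δ over the draw of a training sample S of m examples,

```latex
\mathrm{kl}\!\left( R_S(G_Q) \,\middle\|\, R(G_Q) \right)
\;\le\;
\frac{\mathrm{KL}(Q \| P) + \ln \frac{m+1}{\delta}}{m},
\qquad \text{where } \;
\mathrm{kl}(q \| p) = q \ln \tfrac{q}{p} + (1-q) \ln \tfrac{1-q}{1-p}.
```

Here R_S(G_Q) and R(G_Q) are the empirical and true risks of the Gibbs classifier, and KL(Q‖P) is the Kullback–Leibler divergence between posterior and prior. The sample-compression variants listed above replace the data-independent prior P with a prior over compression sets and message strings.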