Results 1 – 10 of 250,392
A PAC-Bayesian Approach for Domain Adaptation with Specialization to Linear Classifiers
, 2013
"... We provide a first PAC-Bayesian analysis for domain adaptation (DA) which arises when the learning and test distributions differ. It relies on a novel distribution pseudodistance based on a disagreement averaging. Using this measure, we derive a PAC-Bayesian DA bound for the stochastic Gibbs classif ..."
Cited by 3 (0 self)
PAC-Bayesian Collective Stability
"... Recent results have shown that the generalization error of structured predictors decreases with both the number of examples and the size of each example, provided the data distribution has weak dependence and the predictor exhibits a smoothness property called collective stability. These results use an especially strong definition of collective stability that must hold uniformly over all inputs and all hypotheses in the class. We investigate whether weaker definitions of collective stability suffice. Using the PAC-Bayes framework, which is particularly amenable to our new definitions, we ..."
PAC-Bayesian Learning of Linear Classifiers
"... We present a general PAC-Bayes theorem from which all known PAC-Bayes risk bounds are obtained as particular cases. We also propose different learning algorithms for finding linear classifiers that minimize these bounds. These learning algorithms are generally competitive with both AdaBoost and the ..."
Cited by 59 (8 self)
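The snippet above describes algorithms that minimize PAC-Bayes risk bounds for linear classifiers. As a rough illustration (not the paper's own algorithm or notation), a McAllester-style PAC-Bayes bound on the true Gibbs risk can be evaluated from the empirical risk, the KL divergence between posterior and prior, the sample size, and a confidence parameter; all names here are illustrative:

```python
import math

def pac_bayes_bound(emp_risk, kl, m, delta=0.05):
    """McAllester-style PAC-Bayes bound: with probability >= 1 - delta
    over an i.i.d. sample of size m,
    R(Q) <= r(Q) + sqrt((KL(Q||P) + ln(2*sqrt(m)/delta)) / (2*m))."""
    complexity = (kl + math.log(2.0 * math.sqrt(m) / delta)) / (2.0 * m)
    return emp_risk + math.sqrt(complexity)

# A posterior closer to the prior (smaller KL) tightens the bound,
# possibly at the price of a larger empirical risk; the bound also
# tightens as the sample size m grows.
```

Bound-minimizing algorithms of the kind the abstract mentions trade off the two terms, lowering `emp_risk` while keeping `kl` small.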
PAC-Bayesian inequalities for martingales
 IEEE Transactions on Information Theory
, 2012
"... We present a set of high-probability inequalities that control the concentration of weighted averages of multiple (possibly uncountably many) simultaneously evolving and interdependent martingales. Our results extend the PAC-Bayesian (probably approximately correct) analysis in learning the ..."
Cited by 13 (3 self)
PAC-BAYESIAN INDUCTIVE AND TRANSDUCTIVE LEARNING
, 2006
"... We present here a PAC-Bayesian point of view on adaptive supervised classification. Using convex analysis on the set of posterior probability measures on the parameter space, we show how to get local measures of the complexity of the classification model involving the relative entropy of p ..."
Cited by 1 (0 self)
Supplementary Material to A PAC-Bayesian Approach for Domain Adaptation with Specialization to Linear Classifiers
"... In this document, Section 1 contains some lemmas used in subsequent proofs, Section 2 presents an extended proof of the bound on the domain disagreement dis_ρ(D_S, D_T) (Theorem 3 of the main paper), Section 3 introduces other PAC-Bayesian bounds for dis_ρ(D_S, D_T) and R_{P_T}(G_ρ), Section 4 shows equations ..."
Bayesian Network Classifiers
, 1997
"... Recent work in supervised learning has shown that a surprisingly simple Bayesian classifier with strong assumptions of independence among features, called naive Bayes, is competitive with state-of-the-art classifiers such as C4.5. This fact raises the question of whether a classifier with less restr ..."
Cited by 788 (23 self)
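The naive Bayes model mentioned above is simple enough to sketch in a few lines. The following minimal Gaussian naive Bayes classifier (an illustrative implementation, not tied to this paper or any library) assumes features are conditionally independent given the class and models each feature with a univariate Gaussian:

```python
import math
from collections import defaultdict

class GaussianNaiveBayes:
    """Naive Bayes with one Gaussian per (class, feature) pair."""

    def fit(self, X, y):
        by_class = defaultdict(list)
        for row, label in zip(X, y):
            by_class[label].append(row)
        self.priors, self.stats = {}, {}
        for label, rows in by_class.items():
            self.priors[label] = len(rows) / len(X)
            self.stats[label] = []
            for col in zip(*rows):  # per-feature values for this class
                mu = sum(col) / len(col)
                var = max(sum((v - mu) ** 2 for v in col) / len(col), 1e-9)
                self.stats[label].append((mu, var))
        return self

    def predict(self, x):
        def log_joint(label):  # log prior + sum of log Gaussian likelihoods
            lp = math.log(self.priors[label])
            for v, (mu, var) in zip(x, self.stats[label]):
                lp -= 0.5 * math.log(2 * math.pi * var) + (v - mu) ** 2 / (2 * var)
            return lp
        return max(self.priors, key=log_joint)
```

Usage on a toy two-class dataset: `GaussianNaiveBayes().fit([[1.0, 2.0], [1.1, 1.9], [5.0, 6.0], [5.2, 6.1]], [0, 0, 1, 1]).predict([1.05, 2.0])` returns class `0`.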
Estimating Continuous Distributions in Bayesian Classifiers
 In Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence
, 1995
"... When modeling a probability distribution with a Bayesian network, we are faced with the problem of how to handle continuous variables. Most previous work has either solved the problem by discretizing, or assumed that the data are generated by a single Gaussian. In this paper we abandon the normality assumption and instead use statistical methods for nonparametric density estimation. For a naive Bayesian classifier, we present experimental results on a variety of natural and artificial domains, comparing two methods of density estimation: assuming normality and modeling each conditional ..."
Cited by 489 (2 self)
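The comparison this abstract describes, a single fitted Gaussian versus a nonparametric (kernel) density estimate, can be sketched for one continuous feature. The bandwidth `h` and the bimodal toy data below are illustrative assumptions, not taken from the paper:

```python
import math

def gaussian_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def kde_pdf(x, samples, h=0.5):
    """Parzen-window estimate: average of Gaussian kernels of width h
    centered at the observed samples."""
    return sum(gaussian_pdf(x, s, h * h) for s in samples) / len(samples)

# Bimodal feature values: a single fitted Gaussian spreads mass over the
# empty midpoint, while the kernel estimate tracks the two clusters.
samples = [-2.1, -2.0, -1.9, 1.9, 2.0, 2.1]
mu = sum(samples) / len(samples)
var = sum((s - mu) ** 2 for s in samples) / len(samples)
```

On this data the normality assumption overestimates the density at `x = 0` (between the modes) and underestimates it at `x = 2` (inside a mode), which is the kind of mismatch the paper's nonparametric approach addresses.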