Results 1 - 4 of 4
Risk bounds for Statistical Learning
Abstract

Cited by 40 (1 self)
We propose a general theorem providing upper bounds for the risk of an empirical risk minimizer (ERM). We essentially focus on the binary classification framework. We extend Tsybakov's analysis of the risk of an ERM under margin-type conditions by using concentration inequalities for conveniently weighted empirical processes. This allows us to deal with other ways of measuring the "size" of a class of classifiers than entropy with bracketing as in Tsybakov's work. In particular we derive new risk bounds for the ERM when the classification rules belong to some VC-class under margin conditions and discuss the optimality of those bounds in a minimax sense.
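The central object of this abstract, an empirical risk minimizer for binary classification, can be sketched concretely. The following is an illustrative toy example, not the paper's construction: it runs ERM under 0-1 loss over the class of one-dimensional threshold classifiers (all names and the data-generating setup are hypothetical).

```python
import numpy as np

def empirical_risk(classifier, X, y):
    """Empirical 0-1 risk: fraction of training points mislabeled."""
    return float(np.mean(classifier(X) != y))

def erm_threshold(X, y):
    """ERM over the toy class of threshold rules x -> 1[x >= t].

    Only thresholds at the data points (plus -inf) need checking,
    since the empirical risk is constant between consecutive points.
    """
    candidates = np.concatenate(([-np.inf], np.sort(X)))
    best_t, best_risk = -np.inf, np.inf
    for t in candidates:
        risk = empirical_risk(lambda x: (x >= t).astype(int), X, y)
        if risk < best_risk:
            best_t, best_risk = t, risk
    return best_t, best_risk

# Noiseless labels with true threshold 0.5: ERM attains zero empirical risk.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=50)
y = (X >= 0.5).astype(int)
t_hat, r_hat = erm_threshold(X, y)
```

In this noiseless setting the minimizer drives the empirical risk to zero; the paper's bounds concern how far such a minimizer's true risk can exceed the best risk in the class.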
Entropy and the combinatorial dimension
 Inventiones Mathematicae
, 2003
Abstract

Cited by 21 (13 self)
We solve Talagrand's entropy problem: the L2-covering numbers of every uniformly bounded class of functions are exponential in its shattering dimension. This extends Dudley's theorem on classes of {0,1}-valued functions, for which the shattering dimension is the Vapnik-Chervonenkis dimension. In convex geometry, the solution means that the entropy of a convex body K is controlled by the maximal dimension of a cube of a fixed side contained in the coordinate projections of K. This has a number of consequences, including the optimal Elton's Theorem and estimates on the uniform central limit theorem in the real-valued case.
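The shattering (Vapnik-Chervonenkis) dimension mentioned above can be checked by brute force on a small class. This is a hypothetical illustration only: it verifies that {0,1}-valued threshold functions shatter one point but no pair of points, so their VC dimension is 1.

```python
def shatters(points, hypotheses):
    """True if the class realizes every {0,1}-labeling of `points`."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

# Toy class of {0,1}-valued threshold functions x -> 1[x >= t].
thresholds = [i / 10 for i in range(12)]
hypotheses = [lambda x, t=t: int(x >= t) for t in thresholds]

# A single point is shattered, but no two points are: no threshold can
# label the left point 1 and the right point 0. Hence VC dimension 1.
one_point = shatters([0.5], hypotheses)
two_points = shatters([0.3, 0.7], hypotheses)
```

The theorem stated in the abstract says that, for uniformly bounded classes, this combinatorial quantity controls L2-covering numbers up to an exponential.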
ENTROPY CONDITIONS FOR Lr-CONVERGENCE OF EMPIRICAL PROCESSES
Abstract
The Law of Large Numbers (LLN) over classes of functions is a classical topic of empirical process theory. The properties characterizing classes of functions on which the LLN holds uniformly (i.e. Glivenko-Cantelli classes) have been widely studied in the literature. An elegant sufficient condition for such a property is finiteness of the Koltchinskii-Pollard entropy integral, and other conditions have been formulated in terms of suitable combinatorial complexities (e.g. the Vapnik-Chervonenkis dimension). In this paper, we endow the class of functions F with a probability measure and consider the LLN relative to the associated Lr metric. This framework extends the case of uniform convergence over F, which is recovered when r goes to infinity. The main result is an Lr-LLN in terms of a suitable uniform entropy integral which generalizes the Koltchinskii-Pollard entropy integral.
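The uniform LLN that this abstract generalizes concerns the supremum deviation sup over f in F of |empirical mean minus true mean|. A minimal numerical sketch, assuming the Glivenko-Cantelli class of indicators f_t(x) = 1[x <= t] on Uniform(0,1) samples (all choices here are illustrative), shows this supremum shrinking as the sample grows:

```python
import numpy as np

def sup_deviation(sample, thresholds):
    """sup_t |P_n f_t - P f_t| for f_t = 1[x <= t], X ~ Uniform(0, 1).

    For the uniform law, the true mean E[1{X <= t}] is simply t.
    """
    empirical = np.array([(sample <= t).mean() for t in thresholds])
    return float(np.abs(empirical - thresholds).max())

rng = np.random.default_rng(1)
ts = np.linspace(0.0, 1.0, 101)
dev_small = sup_deviation(rng.uniform(size=100), ts)      # n = 100
dev_large = sup_deviation(rng.uniform(size=100_000), ts)  # n = 100,000
```

The paper replaces the sup norm over F by an Lr norm with respect to a measure on F, so the corresponding quantity is an Lr average of such deviations rather than their maximum.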