Results 1–10 of 104
Generalization Performance of Regularization Networks and Support . . .
 IEEE TRANSACTIONS ON INFORMATION THEORY
, 2001
Abstract

Cited by 72 (18 self)
We derive new bounds for the generalization error of kernel machines, such as support vector machines and related regularization networks, by obtaining new bounds on their covering numbers. The proofs make use of a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinite-dimensional unit ball in feature space into a finite-dimensional space. The covering numbers of the class are then determined via the entropy numbers of the operator. These numbers, which characterize the degree of compactness of the operator, can be bounded in terms of the eigenvalues of an integral operator induced by the kernel function used by the machine. As a consequence, we are able to theoretically explain the effect of the choice of kernel function on the generalization performance of support vector machines.
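The mechanism this abstract describes, covering and entropy numbers controlled by the eigenvalue decay of the kernel's integral operator, can be illustrated numerically. A minimal sketch, assuming a Gaussian kernel; the sample size and kernel width are arbitrary illustrative choices, and the scaled Gram matrix only approximates the integral operator:

```python
import numpy as np

# Sample points on [0, 1]; the scaled Gram matrix of a Gaussian (RBF)
# kernel approximates the integral operator induced by the kernel.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=200)

sigma = 0.2  # kernel width (illustrative choice)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma**2)) / len(x)

# The eigenvalues of the Gram matrix decay very rapidly; this fast
# spectral decay is what drives entropy-number bounds of the kind
# discussed in the abstract.
eigvals = np.sort(np.linalg.eigvalsh(K))[::-1]
print(eigvals[:5])
print(eigvals[20] / eigvals[0])  # ratio is tiny: fast spectral decay
```

Smoother kernels (larger widths) yield faster eigenvalue decay, hence smaller covering numbers and tighter generalization bounds, which is the qualitative effect of kernel choice the paper quantifies.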
Fast rates for support vector machines using gaussian kernels
 Ann. Statist
, 2004
Abstract

Cited by 52 (7 self)
We establish learning rates up to the order of n⁻¹ for support vector machines with hinge loss (L1-SVMs) and nontrivial distributions. For the stochastic analysis of these algorithms we use recently developed concepts such as Tsybakov’s noise assumption and local Rademacher averages. Furthermore, we introduce a new geometric noise condition for distributions that is used to bound the approximation error of Gaussian kernels in terms of their widths.
Analytic Methods for Simulated Light Transport
, 1995
Abstract

Cited by 39 (9 self)
this dissertation was conducted. Special thanks to Erin Shaw, Steve Westin, Pete Shirley, and John Hughes for carefully reading various portions of this work and offering detailed comments. Many thanks to my coauthors Julie Dorsey, Dave Kirk, Kevin Novins, David Salesin, Francois Sillion, Brian Smits, Ken Torrance, and Steve Westin, from whom I have learned so much over the years, and to Pete Shirley, Dani Lischinski, Bill Gropp, and Jim Ferwerda for innumerable stimulating discussions. Thanks also to Ben Trumbore and Albert Dicruttalo for modeling and software support, to Dan Kartch for all the help with document preparation, to Jonathan Corson-Rikert, Ellen French, Linda Stephens, and Judy Smith for administrative support, and to Hurf Sheldon for many years of cheerful and professional systems support. From my days at Apollo Computer, I'd like to thank Al Lopez, Fabio Pettinati, Ken Severson, and Terry Lindgren for all their encouragement. Many fellow students and assorted friends have also helped and inspired me along the way, including Lenny Pitt, Mukesh Prasad, Michael Monks, Ken Musgrave, Andrew Glassner, Mimi and Noel Mateo, and Susan Vonderheide
On Weighted Hilbert Spaces and Integration of Functions of Infinitely Many Variables
, 2012
Abstract

Cited by 22 (4 self)
The consecutive numbering of the publications is determined by their chronological order. The aim of this preprint series is to make new research rapidly available for scientific discussion. Therefore, the responsibility for the contents lies solely with the authors. The publications will be distributed by the authors.
Hyperplane conjecture for quotient spaces of Lp
 Forum Math
, 1994
Abstract

Cited by 13 (1 self)
We give a positive solution for the hyperplane conjecture for quotient spaces F of L_p, where 1 < p ≤ ∞:

vol(B_F)^((n−1)/n) ≤ c₀ p′ · sup { vol(B_F ∩ H) : H a hyperplane }.

This result is extended to Banach lattices which do not contain ℓ₁ⁿ’s uniformly. Our main tools are tensor products and minimal volume ratios with respect to L_p-sections. An open problem in the theory of convex sets is the so-called hyperplane problem: does there exist a universal constant c > 0 such that for all n ∈ ℕ and all convex, symmetric bodies K ⊂ ℝⁿ one has vol(K)^((n−1)/n) ≤ c · sup { vol(K ∩ H) : H a hyperplane }?
Entropy Numbers, Operators and Support Vector Kernels
 IEEE TRANSACTIONS ON INFORMATION THEORY
, 1998
Abstract

Cited by 12 (3 self)
We derive new bounds for the generalization error of feature space machines, such as support vector machines and related regularization networks, by obtaining new bounds on their covering numbers. The proofs are based on a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinite-dimensional unit ball in feature space into a finite-dimensional space. The covering numbers of the class are then determined via the entropy numbers of the operator. These numbers, which characterize the degree of compactness of the operator, can be bounded in terms of the eigenvalues of an integral operator induced by the kernel function used by the machine. As a consequence, we are able to theoretically explain the effect of the choice of kernel functions on the generalization performance of support vector machines.
An extension of Milman’s reverse Brunn–Minkowski inequality
 GAFA 5:3
, 1995
Abstract

Cited by 11 (1 self)
The classical Brunn–Minkowski inequality states that for compact A₁, A₂ ⊂ ℝⁿ, …
Complex interpolation between Hilbert, Banach and operator spaces
, 2008
Abstract

Cited by 7 (0 self)
Motivated by a question of Vincent Lafforgue, we study the Banach spaces X satisfying the following property: there is a function ε → Δ_X(ε) tending to zero with ε > 0 such that every operator T: L₂ → L₂ with ‖T‖ ≤ ε that is simultaneously contractive (i.e. of norm ≤ 1) on L₁ and on L∞ must be of norm ≤ Δ_X(ε) on L₂(X). We show that Δ_X(ε) ∈ O(ε^α) for some α > 0 iff X is isomorphic to a quotient of a subspace of an ultraproduct of θ-Hilbertian spaces for some θ > 0 (see Corollary 6.7), where θ-Hilbertian is meant in a slightly more general sense than in our previous paper [43]. Let B_r(L₂(µ)) be the space of all regular operators on L₂(µ). We are able to describe the complex interpolation space (B_r(L₂(µ)), B(L₂(µ)))_θ. We show that T: L₂(µ) → L₂(µ) belongs to this space iff T ⊗ id_X is bounded on L₂(X) for any θ-Hilbertian space X. More generally, we are able to describe the spaces (B(ℓ_{p₀}), B(ℓ_{p₁}))_θ or (B(L_{p₀}), B(L_{p₁}))_θ for any pair 1 ≤ p₀, p₁ ≤ ∞ and 0 < θ < 1. In the same vein, given a locally compact Abelian group G, let M(G) (resp. PM(G)) be the space of complex measures (resp. pseudomeasures) on G …
Amenability of Banach algebras of compact operators
 Israel J. Math.
, 1994
Abstract

Cited by 6 (3 self)
In this paper we study conditions on a Banach space X that ensure that the Banach algebra K(X) of compact operators is amenable. We give a symmetrized approximation property of X which is proved to be such a condition. This property is satisfied by a wide range of Banach spaces, including all the classical spaces. We then investigate which constructions of new Banach spaces from old ones preserve the property of carrying amenable algebras of compact operators. Roughly speaking, dual spaces, predual spaces, and certain tensor products do inherit this property, and direct sums do not. For direct sums this question is closely related to factorization of linear operators. In the final section we discuss some open questions, in particular the converse problem of what properties of X are implied by the amenability of K(X). Amenability is a cohomological property of Banach algebras which was introduced in [J]. The definition is given below. It may be thought of as …