Results 1–10 of 128
Sparseness of support vector machines
"... Support vector machines (SVMs) construct decision functions that are linear combinations of kernel evaluations on the training set. The samples with nonvanishing coefficients are called support vectors. In this work we establish lower (asymptotical) bounds on the number of support vectors. On our w ..."
Abstract

Cited by 136 (21 self)
Support vector machines (SVMs) construct decision functions that are linear combinations of kernel evaluations on the training set. The samples with nonvanishing coefficients are called support vectors. In this work we establish lower (asymptotic) bounds on the number of support vectors. On our way we prove several results which are of great importance for the understanding of SVMs. In particular, we describe to which “limit” SVM decision functions tend, discuss the corresponding notion of convergence, and provide some results on the stability of SVMs using subdifferential calculus in the associated reproducing kernel Hilbert space.
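As an illustrative sketch (not taken from the paper), the structure described above — a decision function to which only the support vectors contribute — can be written in a few lines of NumPy. The training points, dual coefficients, and kernel parameters below are made-up examples:

```python
import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    """Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

# Hypothetical trained SVM: dual coefficients alpha_i, most exactly zero.
X_train = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5], [0.5, 2.0]])
y_train = np.array([-1.0, 1.0, 1.0, -1.0])
alpha = np.array([0.0, 0.7, 0.0, 0.3])  # samples 1 and 3 are support vectors
b = -0.1

def decision(x):
    """f(x) = sum_i alpha_i * y_i * k(x_i, x) + b."""
    return sum(a * yi * rbf_kernel(xi, x)
               for a, yi, xi in zip(alpha, y_train, X_train)) + b

# Only the samples with nonvanishing alpha_i enter f at all:
support_vectors = X_train[alpha != 0.0]
```

The lower bounds of the paper concern how large the set `support_vectors` must asymptotically be relative to the training set.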
On robust properties of convex risk minimization methods for pattern recognition
 Journal of Machine Learning Research
, 2004
"... The paper brings together methods from two disciplines: machine learning theory and robust statistics. We argue that robustness is an important aspect and we show that many existing machine learning methods based on the convex risk minimization principle have − besides other good properties − also t ..."
Abstract

Cited by 25 (11 self)
The paper brings together methods from two disciplines: machine learning theory and robust statistics. We argue that robustness is an important aspect, and we show that many existing machine learning methods based on the convex risk minimization principle have, besides other good properties, the advantage of being robust. Robustness properties of machine learning methods based on convex risk minimization are investigated for the problem of pattern recognition. Assumptions are given for the existence of the influence function of the classifiers and for bounds on the influence function. Kernel logistic regression, support vector machines, least squares, and the AdaBoost loss function are treated as special cases. Some results on the robustness of such methods are also obtained for the sensitivity curve and the maxbias, which are two other robustness criteria. A sensitivity analysis of the support vector machine is given.
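One of the robustness criteria mentioned, the sensitivity curve, admits a short numerical sketch (the estimators and data below are our own illustration, not the paper's setting): it measures the scaled effect of a single contaminating observation z on an estimator T.

```python
import statistics

def sensitivity_curve(estimator, sample, z):
    """SC_n(z) = n * (T(x_1,...,x_{n-1}, z) - T(x_1,...,x_{n-1})):
    the scaled change in estimator T caused by one added observation z."""
    n = len(sample) + 1
    return n * (estimator(sample + [z]) - estimator(sample))

data = [0.1, -0.4, 0.3, 0.0, -0.2, 0.5, -0.1, 0.2, 0.4]

# The mean has an unbounded sensitivity curve: SC grows linearly in z.
sc_mean_far = sensitivity_curve(statistics.mean, data, 1000.0)

# The median's sensitivity curve stays bounded, however far out z is.
sc_median_far = sensitivity_curve(statistics.median, data, 1000.0)
```

A bounded sensitivity curve is a finite-sample analogue of the bounded influence functions studied in the paper.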
Property (T) and rigidity for actions on Banach spaces
, 2005
"... Abstract. We study property (T) and the fixed point property for actions on L p and other Banach spaces. We show that property (T) holds when L 2 is replaced by L p (and even a subspace/quotient of L p), and that in fact it is independent of 1 ≤ p < ∞. We show that the fixed point property for L p f ..."
Abstract

Cited by 24 (4 self)
Abstract. We study property (T) and the fixed point property for actions on L^p and other Banach spaces. We show that property (T) holds when L^2 is replaced by L^p (and even a subspace/quotient of L^p), and that in fact it is independent of 1 ≤ p < ∞. We show that the fixed point property for L^p follows from property (T) when 1 < p < 2 + ε. For simple Lie groups and their lattices, we prove that the fixed point property for L^p holds for any 1 < p < ∞ if and only if the rank is at least two. Finally, we obtain a superrigidity result for actions of irreducible lattices in products of general groups on superreflexive Banach spaces.
Consistency and robustness of kernel based regression
 University of Dortmund, SFB 475, TR 01/05, submitted
, 2005
"... We investigate properties of kernel based regression (KBR) methods which are inspired by the convex risk minimization method of support vector machines. We first describe the relation between the used loss function of the KBR method and the tail of the response variable Y. We then establish a consis ..."
Abstract

Cited by 21 (13 self)
We investigate properties of kernel based regression (KBR) methods which are inspired by the convex risk minimization method of support vector machines. We first describe the relation between the loss function used by the KBR method and the tail of the response variable Y. We then establish a consistency result for KBR and give assumptions for the existence of the influence function. In particular, our results allow one to choose the loss function and the kernel so as to obtain computationally tractable and consistent KBR methods having bounded influence functions. Furthermore, bounds for the sensitivity curve, which is a finite sample version of the influence function, are developed, and some numerical experiments are discussed.
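A rough illustration of why the choice of loss matters for robustness (a sketch under our own simplifications, not the paper's actual conditions): the effect of a gross outlier on the fit enters through the loss derivative L'(r), which is unbounded for the squared loss but bounded for a Lipschitz loss such as the Huber loss.

```python
def squared_loss_deriv(r):
    """L(r) = r^2 / 2, so L'(r) = r: grows without bound in the residual."""
    return r

def huber_loss_deriv(r, c=1.345):
    """Huber loss derivative: the residual clipped to [-c, c], hence bounded."""
    return max(-c, min(c, r))

residuals = [0.5, -2.0, 10.0, 1000.0]  # the last one is a gross outlier
sq = [squared_loss_deriv(r) for r in residuals]
hu = [huber_loss_deriv(r) for r in residuals]
```

With the clipped derivative, the outlier's pull on the estimate is capped at c, which is the intuition behind the bounded-influence KBR methods described above.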
The Dirichlet Problem for the Total Variation Flow
, 2001
"... We introduce a new concept of solution for the Dirichlet problem for the total variational flow named entropy solution. Using Kruzhkov's method of doubling variables both in space and in time we prove uniqueness and a comparison principle in L¹ for entropy solutions. To prove the existence we use th ..."
Abstract

Cited by 20 (7 self)
We introduce a new concept of solution for the Dirichlet problem for the total variation flow, called entropy solutions. Using Kruzhkov's method of doubling variables both in space and in time, we prove uniqueness and a comparison principle in L¹ for entropy solutions. To prove existence we use nonlinear semigroup theory and show that when the initial and boundary data are nonnegative, the semigroup solutions are strong solutions.
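For orientation (a standard formulation of the equation, not quoted from the abstract above), the total variation flow with Dirichlet boundary datum g on a domain Ω can be written formally as

```latex
\partial_t u = \operatorname{div}\!\left(\frac{Du}{|Du|}\right) \quad \text{in } (0,\infty)\times\Omega,
\qquad u = g \ \text{on } (0,\infty)\times\partial\Omega,
\qquad u(0,\cdot) = u_0 \ \text{in } \Omega.
```

The quotient Du/|Du| is ill-defined wherever Du vanishes; this degeneracy is what motivates a weaker solution concept such as the entropy solutions introduced in the paper.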
Lectures on Young Measure Theory and its Applications in Economics
 Rend. Istit. Mat. Univ. Trieste
, 1998
"... this paper we work with the following hypothesis: ..."
Continuum-sites stepping-stone models, coalescing exchangeable partitions, and random trees
, 1998
"... Analogues of steppingstone models are considered where the sitespace is continuous, the migration process is a general Markov process, and the type{space is infinite. Such processes were defined in previous work of the second author by specifying a Feller transition semigroup in terms of expectati ..."
Abstract

Cited by 12 (5 self)
Analogues of stepping-stone models are considered where the site space is continuous, the migration process is a general Markov process, and the type space is infinite. Such processes were defined in previous work of the second author by specifying a Feller transition semigroup in terms of expectations of suitable functionals for systems of coalescing Markov processes. An alternative representation is obtained here in terms of a limit of interacting particle systems. It is shown that, under a mild condition on the migration process, the continuum-sites stepping-stone process has continuous sample paths. The case when the migration process is Brownian motion on the circle is examined in detail using a duality relation between coalescing and annihilating Brownian motion. This duality relation is also used to show that a random compact metric space that is naturally associated to an infinite family of coalescing Brownian motions on the circle has Hausdorff and packing dimension both almost surely equal to 1/2 and, moreover, this space is capacity equivalent to the middle-1/2 Cantor set (and hence also to the Brownian zero set).
Approximation Properties for Noncommutative Lp-Spaces Associated with Discrete Groups
"... Abstract. Let 1 < p < ∞. It is shown that if G is a discrete group with the approximation property introduced by Haagerup and Kraus, then the noncommutative Lp(V N(G)) space has the operator space approximation property. If, in addition, the group von Neumann algebra V N(G) has the QWEP, i.e. is a q ..."
Abstract

Cited by 12 (6 self)
Abstract. Let 1 < p < ∞. It is shown that if G is a discrete group with the approximation property introduced by Haagerup and Kraus, then the noncommutative Lp(VN(G)) space has the operator space approximation property. If, in addition, the group von Neumann algebra VN(G) has the QWEP, i.e. is a quotient of a C∗-algebra with Lance’s weak expectation property, then Lp(VN(G)) actually has the completely contractive approximation property and the approximation maps can be chosen to be finite-rank completely contractive multipliers on Lp(VN(G)). Finally, we show that if G is a countable discrete group having the approximation property and VN(G) has the QWEP, then Lp(VN(G)) has a very nice local structure, i.e. it is a COLp space and has a completely bounded Schauder basis.
Dual Banach algebras: representations and injectivity
, 2008
"... We study representations of Banach algebras on reflexive Banach spaces. Algebras which admit such representations which are bounded below seem to be a good generalisation of Arens regular Banach algebras; this class includes dual Banach algebras as defined by Runde, but also all group algebras, and ..."
Abstract

Cited by 11 (7 self)
We study representations of Banach algebras on reflexive Banach spaces. Algebras which admit such representations that are bounded below seem to be a good generalisation of Arens regular Banach algebras; this class includes dual Banach algebras as defined by Runde, but also all group algebras and all discrete (weakly cancellative) semigroup algebras. Such algebras also behave in a similar way to C∗- and W∗-algebras; we show that interpolation space techniques can be used in place of GNS-type arguments. We define a notion of injectivity for dual Banach algebras, and show that this is equivalent to Connes-amenability. We conclude by looking at the problem of defining a well-behaved tensor product for dual Banach algebras.