Results 1 - 4 of 4
Nonlinear component analysis as a kernel eigenvalue problem

, 1996
Abstract

Cited by 1048 (72 self)
We describe a new method for performing a nonlinear form of Principal Component Analysis. By the use of integral operator kernel functions, we can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance the space of all possible 5-pixel products in 16x16 images. We give the derivation of the method, along with a discussion of other techniques which can be made nonlinear with the kernel approach, and present first experimental results on nonlinear feature extraction for pattern recognition.
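The recipe the abstract describes (kernel matrix, centering in feature space, eigendecomposition) can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: the RBF kernel and the `gamma` value are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project the training points onto leading nonlinear principal components."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the kernel matrix, which centers the data in feature space.
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # eigh returns eigenvalues in ascending order; take the largest ones.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas, lambdas = eigvecs[:, idx], eigvals[idx]
    # Normalize so each feature-space eigenvector has unit length.
    alphas = alphas / np.sqrt(np.maximum(lambdas, 1e-12))
    return Kc @ alphas  # projections of the training points

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (50, 2)
```

Because the kernel matrix is double-centered, each extracted component has zero mean over the training set, mirroring ordinary PCA.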
Support Vector Machines for Multi-Class Pattern Recognition
, 1999
Abstract

Cited by 144 (5 self)
The solution of binary classification problems using support vector machines (SVMs) is well developed, but multiclass problems with more than two classes have typically been solved by combining independently produced binary classifiers. We propose a formulation of the SVM that enables a multiclass pattern recognition problem to be solved in a single optimisation. We also propose a similar generalization of linear programming machines. We report experiments using benchmark datasets in which these two methods achieve a reduction in the number of support vectors and kernel calculations needed.
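The baseline the abstract contrasts against, combining independently trained binary classifiers, can be sketched as a one-vs-rest scheme. As a stand-in for a binary SVM, this sketch uses a toy centroid-distance scorer (`BinaryCentroid` is an invented helper, not from the paper); the combination logic is the point.

```python
import numpy as np

class BinaryCentroid:
    """Toy binary scorer standing in for an independently trained binary SVM."""
    def fit(self, X, y):  # y in {-1, +1}
        self.mu_pos = X[y == 1].mean(axis=0)
        self.mu_neg = X[y == -1].mean(axis=0)
        return self
    def score(self, X):
        # Higher score means closer to the positive-class centroid.
        return (np.linalg.norm(X - self.mu_neg, axis=1)
                - np.linalg.norm(X - self.mu_pos, axis=1))

def one_vs_rest_fit(X, y, classes):
    # One binary problem per class: this class vs. all the others.
    return {c: BinaryCentroid().fit(X, np.where(y == c, 1, -1)) for c in classes}

def one_vs_rest_predict(models, X, classes):
    # Predict the class whose binary scorer is most confident.
    scores = np.stack([models[c].score(X) for c in classes], axis=1)
    return np.array(classes)[np.argmax(scores, axis=1)]

rng = np.random.default_rng(1)
classes = [0, 1, 2]
X = np.concatenate([rng.normal(loc=c * 4.0, size=(30, 2)) for c in classes])
y = np.repeat(classes, 30)
models = one_vs_rest_fit(X, y, classes)
pred = one_vs_rest_predict(models, X, classes)
print((pred == y).mean())
```

The single-optimisation formulation the abstract proposes replaces these K separate problems with one joint problem over all class decision functions.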
Properties of Support Vector Machines
, 1997
Abstract

Cited by 29 (4 self)
Support Vector Machines (SVMs) perform pattern recognition between two point classes by finding a decision surface determined by certain points of the training set, termed Support Vectors (SV). This surface, which in some feature space of possibly infinite dimension can be regarded as a hyperplane, is obtained by solving a quadratic programming problem that depends on a regularization parameter. In this paper we study some mathematical properties of support vectors and show that the decision surface can be written as the sum of two orthogonal terms, the first depending only on the margin vectors (which are SVs lying on the margin), the second proportional to the regularization parameter. For almost all values of the parameter, this enables us to predict how the decision surface varies for small parameter changes. In the special but important case of a feature space of finite dimension m, we also show that there are at most m + 1 margin vectors and observe that m + 1 SVs are usually sufficient to fully determine the decision surface. For relatively small m this latter result leads to a considerable reduction of the SV number.
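In standard SVM notation, the decomposition the abstract describes is commonly written as follows; this is a sketch of the usual split, and the paper's exact formulation may differ in detail:

```latex
w \;=\; \underbrace{\sum_{i:\, 0 < \alpha_i < C} \alpha_i\, y_i\, x_i}_{\text{margin vectors}}
\;+\; C \underbrace{\sum_{j:\, \alpha_j = C} y_j\, x_j}_{\text{bound vectors}}
```

Here the margin vectors are the SVs whose Lagrange multipliers satisfy 0 < \alpha_i < C (they lie exactly on the margin), while the bound vectors have \alpha_j = C. The first term is independent of the regularization parameter C; the second scales with it, which is what lets one predict how the surface moves under small changes of C.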
Title of Dissertation: LEARNING ALGORITHMS FOR AUDIO AND VIDEO PROCESSING - INDEPENDENT COMPONENT ANALYSIS AND SUPPORT VECTOR MACHINE BASED APPROACHES