Results 1–10 of 244
Face Recognition: A Literature Survey
2000
Cited by 859 (21 self)
Abstract: This paper provides an up-to-date critical survey of still- and video-based face recognition research. There are two underlying motivations for us to write this survey paper: the first is to provide an up-to-date review of the existing literature, and the second is to offer some insights into the studies of machine recognition of faces. To provide a comprehensive survey, we not only categorize existing recognition techniques but also present detailed descriptions of representative methods within each category. In addition, ...
An introduction to kernel-based learning algorithms
IEEE Transactions on Neural Networks, 2001
Cited by 375 (48 self)
Abstract: This paper provides an introduction to support vector machines (SVMs), kernel Fisher discriminant analysis, and ...
In defense of one-vs-all classification
Journal of Machine Learning Research, 2004
Cited by 203 (0 self)
Abstract: We consider the problem of multiclass classification. Our main thesis is that a simple “one-vs-all” scheme is as accurate as any other approach, assuming that the underlying binary classifiers are well-tuned regularized classifiers such as support vector machines. This thesis is interesting in that it disagrees with a large body of recent published work on multiclass classification. We support our position by means of a critical review of the existing literature, a substantial collection of carefully controlled experimental work, and theoretical arguments.
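The one-vs-all scheme this abstract defends can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: a nearest-centroid scorer stands in for the well-tuned regularized SVMs the authors assume, and the data are hypothetical.

```python
# One-vs-all multiclass classification (sketch).
# Assumption: a nearest-centroid scorer replaces the regularized SVMs
# the paper actually uses; the example data are hypothetical.

def fit_binary_scorer(X, y, positive):
    # Larger score means "more likely in class `positive`": squared
    # distance to the rest-centroid minus distance to the class centroid.
    pos = [x for x, t in zip(X, y) if t == positive]
    neg = [x for x, t in zip(X, y) if t != positive]
    cp = [sum(v) / len(pos) for v in zip(*pos)]
    cn = [sum(v) / len(neg) for v in zip(*neg)]

    def score(x):
        dp = sum((a - b) ** 2 for a, b in zip(x, cp))
        dn = sum((a - b) ** 2 for a, b in zip(x, cn))
        return dn - dp

    return score

def one_vs_all_fit(X, y):
    # One binary scorer per class, each trained on class-vs-rest.
    return {c: fit_binary_scorer(X, y, c) for c in set(y)}

def one_vs_all_predict(scorers, x):
    # Predict the class whose binary scorer is most confident.
    return max(scorers, key=lambda c: scorers[c](x))
```

The paper's point is that nothing more elaborate than this decomposition is needed, provided each binary classifier is well tuned.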
Nonlinear Discriminant Analysis using Kernel Functions
Advances in Neural Information Processing Systems, 1999
Cited by 85 (7 self)
Abstract: Fisher's linear discriminant analysis (LDA) is a classical multivariate technique for both dimension reduction and classification. The data vectors are transformed into a low-dimensional subspace such that the class centroids are spread out as much as possible. In this subspace, LDA works as a simple prototype classifier. The resulting decision boundaries are linear. However, in many applications the linear boundaries do not adequately separate the classes, and the possibility of modeling more complex boundaries would be desirable. In this paper we present a nonlinear generalization of discriminant analysis that implements the method of representing dot products of pattern vectors by kernel functions. This technique allows us to efficiently compute discriminant functions in arbitrary feature spaces for which such kernel representations exist.
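The kernel representation of dot products that this abstract relies on can be checked directly on a toy case: a degree-2 polynomial kernel evaluated in input space equals the dot product of an explicit feature map. This is an illustrative sketch; the feature map and inputs are not from the paper.

```python
import math

# The kernel trick: k(x, z) equals the dot product of explicit feature
# maps, so discriminant directions can be expressed as kernel expansions
# without ever forming the feature space.

def poly2_features(x):
    # Explicit degree-2 feature map for 2-D input:
    # phi(x) = (x1^2, x2^2, sqrt(2) * x1 * x2)
    return (x[0] ** 2, x[1] ** 2, math.sqrt(2) * x[0] * x[1])

def poly2_kernel(x, z):
    # k(x, z) = (x . z)^2, computed entirely in input space.
    return (x[0] * z[0] + x[1] * z[1]) ** 2

x, z = (1.0, 2.0), (3.0, 0.5)
phi_dot = sum(a * b for a, b in zip(poly2_features(x), poly2_features(z)))
assert abs(phi_dot - poly2_kernel(x, z)) < 1e-9  # both equal 16.0
```

For richer kernels, such as the Gaussian kernel, the feature space is infinite-dimensional, which is exactly why the kernel-side computation matters.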
Learning Gene Functional Classifications From Multiple Data Types
Journal of Computational Biology, 2002
Cited by 75 (1 self)
Abstract: In our attempts to understand cellular function at the molecular level, we must be able to synthesize information from disparate types of genomic data. We consider the problem of inferring gene functional classifications from a heterogeneous data set consisting of DNA microarray expression measurements and phylogenetic profiles from whole-genome sequence comparisons. We demonstrate the application of the support vector machine (SVM) learning algorithm to this functional inference task. Our results suggest the importance of exploiting prior information about the heterogeneity of the data. In particular, we propose an SVM kernel function that is explicitly heterogeneous. In addition, we describe feature scaling methods for further exploiting prior knowledge of heterogeneity by giving each data type different weights.
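One way to read "an SVM kernel function that is explicitly heterogeneous" is as a weighted combination of one kernel per data type; a weighted sum of positive semidefinite kernels is again a valid kernel. The sketch below assumes linear per-type kernels and hypothetical weights, names, and values; the paper's actual kernel and scaling may differ.

```python
# Heterogeneous kernel sketch: one kernel per data type, combined with
# per-type weights. Data-type names, weights, and values here are all
# hypothetical illustrations.

def linear_kernel(u, v):
    return sum(a * b for a, b in zip(u, v))

def heterogeneous_kernel(x, z, weights):
    # x, z: dicts mapping data-type name -> feature vector for one gene.
    return sum(w * linear_kernel(x[t], z[t]) for t, w in weights.items())

gene_a = {"expression": [0.2, 1.1], "phylo": [1.0, 0.0, 1.0]}
gene_b = {"expression": [0.1, 0.9], "phylo": [1.0, 1.0, 0.0]}
weights = {"expression": 0.7, "phylo": 0.3}  # hypothetical scaling
k = heterogeneous_kernel(gene_a, gene_b, weights)
```

The feature-scaling idea the abstract mentions corresponds to adjusting these per-type weights using prior knowledge.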
Gene functional classification from heterogeneous data
In Proceedings of the Fifth Annual International Conference on Computational Biology, April 22–25, 2001
Cited by 70 (1 self)
Abstract: In our attempts to understand cellular function at the molecular level, we must be able to synthesize information from disparate types of genomic data. We consider the problem of inferring gene functional classifications from a heterogeneous data set consisting of DNA microarray expression measurements and phylogenetic profiles from whole-genome sequence comparisons. We demonstrate the application of the support vector machine (SVM) learning algorithm to this functional inference task. Our results suggest the importance of exploiting prior information about the heterogeneity of the data. In particular, we propose an SVM kernel function that is explicitly heterogeneous. We also show how to use knowledge about heterogeneity to aid in feature selection.
A robust minimax approach to classification
Journal of Machine Learning Research, 2002
Cited by 61 (7 self)
Abstract: When constructing a classifier, the probability of correct classification of future data points should be maximized. We consider a binary classification problem where the mean and covariance matrix of each class are assumed to be known. No further assumptions are made with respect to the class-conditional distributions. Misclassification probabilities are then controlled in a worst-case setting: that is, under all possible choices of class-conditional densities with the given mean and covariance matrix, we minimize the worst-case (maximum) probability of misclassification of future data points. For a linear decision boundary, this desideratum is translated in a very direct way into a (convex) second-order cone optimization problem, with complexity similar to a support vector machine problem. The minimax problem can be interpreted geometrically as minimizing the maximum of the Mahalanobis distances to the two classes. We address the issue of robustness with respect to estimation errors (in the means and covariances of the ...
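The worst-case probability in this distribution-free setting comes from the one-sided Chebyshev (Cantelli) inequality: with mean and covariance fixed, the misclassification probability of a linear rule is at most 1/(1 + d^2), where d is the Mahalanobis distance from the class mean to the decision hyperplane. A sketch under a diagonal-covariance assumption (chosen only to avoid matrix inversion; all numbers hypothetical):

```python
import math

# Worst-case bound behind minimax classification: for a class with mean
# `mean` and diagonal covariance `var_diag`, the probability of falling
# on the wrong side of the hyperplane a.x = b is at most 1 / (1 + d^2),
# by the one-sided Chebyshev (Cantelli) inequality.

def mahalanobis_to_hyperplane(mean, var_diag, a, b):
    # d = (a . mean - b) / sqrt(a^T Sigma a), with diagonal Sigma.
    margin = sum(ai * mi for ai, mi in zip(a, mean)) - b
    spread = math.sqrt(sum(ai * ai * vi for ai, vi in zip(a, var_diag)))
    return margin / spread

def worst_case_error(d):
    return 1.0 / (1.0 + d * d)

# Hypothetical class: mean (2, 0), unit variances, boundary x1 = 0.
d = mahalanobis_to_hyperplane((2.0, 0.0), (1.0, 1.0), (1.0, 0.0), 0.0)
bound = worst_case_error(d)  # d = 2, so the bound is 0.2
```

Maximizing d for both classes simultaneously is what the paper casts as a second-order cone program.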
A Mathematical Programming Approach to the Kernel Fisher Algorithm
2001
Cited by 56 (14 self)
Abstract: We investigate a new kernel-based classifier: the Kernel Fisher Discriminant (KFD). A mathematical programming formulation based on the observation that KFD maximizes the average margin permits an interesting modification of the original KFD algorithm, yielding the sparse KFD. We find that both KFD and the proposed sparse KFD can be understood in a unifying probabilistic context. Furthermore, we show connections to Support Vector Machines and Relevance Vector Machines. From this understanding, we are able to outline an interesting kernel-regression technique based upon the KFD algorithm. Simulations support the usefulness of our approach.
KPCA plus LDA: a complete kernel Fisher discriminant framework for feature extraction and recognition
IEEE Transactions on Pattern Analysis and Machine Intelligence
Cited by 55 (4 self)
Abstract: This paper examines the theory of kernel Fisher discriminant analysis (KFD) in a Hilbert space and develops a two-phase KFD framework, i.e., kernel principal component analysis (KPCA) plus Fisher linear discriminant analysis (LDA). This framework provides novel insights into the nature of KFD. Based on this framework, the authors propose a complete kernel Fisher discriminant analysis (CKFD) algorithm. CKFD can be used to carry out discriminant analysis in “double discriminant subspaces.” The fact that it can make full use of two kinds of discriminant information, regular and irregular, makes CKFD a more powerful discriminator. The proposed algorithm was tested and evaluated using the FERET face database and the CENPARMI handwritten numeral database. The experimental results show that CKFD outperforms other KFD algorithms.
Index Terms: kernel-based methods, subspace methods, principal component analysis (PCA), Fisher linear discriminant analysis (LDA or FLD), feature extraction, machine learning, face recognition, handwritten digit recognition.
Locally Linear Discriminant Analysis for Multimodally Distributed Classes for Face Recognition with a Single Model Image
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005
Cited by 45 (3 self)
Abstract: We present a novel method of nonlinear discriminant analysis involving a set of locally linear transformations, called “Locally Linear Discriminant Analysis” (LLDA). The underlying idea is that global nonlinear data structures are locally linear, and local structures can be linearly aligned. Input vectors are projected into each local feature space by linear transformations found to yield locally linearly transformed classes that maximize the between-class covariance while minimizing the within-class covariance. In face recognition, linear discriminant analysis (LDA) has been widely adopted owing to its efficiency, but it does not capture the nonlinear manifolds of faces that exhibit pose variations. Conventional nonlinear classification methods based on kernels, such as generalized discriminant analysis (GDA) and the support vector machine (SVM), have been developed to overcome the shortcomings of the linear method, but they have the drawbacks of high computational cost of classification and overfitting. Our method is for multiclass nonlinear discrimination, and it is computationally highly efficient compared to GDA. The method does not suffer from overfitting by virtue of the linear base structure of the solution. A novel gradient-based learning algorithm is proposed for finding the optimal set of local linear bases. The optimization does not exhibit a local-maxima problem. The transformation functions facilitate robust face recognition in a low-dimensional subspace, under pose variations, using a single model image. Classification results are given for both synthetic and real face data.
Index Terms: linear discriminant analysis, generalized discriminant analysis, support vector machine, dimensionality reduction, face recognition, feature extraction, pose invariance, subspace representation.