KPCA plus LDA: a complete kernel Fisher discriminant framework for feature extraction and recognition (2005)
Download Links
- [www.dtic.upf.edu]
- [www.tecn.upf.es]
- [www.cistib.org]
- [repository.lib.polyu.edu.hk]
- DBLP
Other Repositories/Bibliography
Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence
Citations: 139 (7 self)
Citations
13215 | Statistical Learning Theory - Vapnik - 1998
Citation Context: ..., feature extraction, machine learning, face recognition, handwritten digit recognition. 1 INTRODUCTION Over the last few years, kernel-based learning machines, e.g., support vector machines (SVMs) [1], kernel principal component analysis (KPCA), and kernel Fisher discriminant analysis (KFD), have aroused considerable interest in the fields of pattern recognition and machine learning [2]. KPCA was ... |
3879 | Eigenfaces for recognition - Turk, Pentland - 1991 |
3781 | Introduction to Statistical Pattern Recognition - Fukunaga - 1990
Citation Context: ...ays holds for every nonzero vector φ. In such a case, the Fisher criterion can be directly employed to extract a set of optimal discriminant vectors (projection axes) using the standard LDA algorithm [35]. Its physical meaning is that, after the projection of samples onto these axes, the ratio of the between-class scatter to the within-class scatter is maximized. However, in a high-dimensional (even in... |
2827 | Learning with kernels - Schölkopf, Smola - 2002 |
2572 | Functional Analysis - Rudin - 1991
Citation Context: ...r, 3. positive operator, and 4. self-adjoint (symmetric) operator on Hilbert space H. The proof is given in Appendix A. Since every eigenvalue of a positive operator is nonnegative in a Hilbert space [48], from Lemma 1, it follows that all nonzero eigenvalues of St are positive. It is these positive eigenvalues that are of interest to us. Schölkopf et al. [3] have suggested the following way to find t... |
2309 | Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection - Belhumeur, Hespanha, et al. - 1997
Citation Context: ...r matrix. Baudat and Anouar [6] employed the QR decomposition technique to avoid the singularity by removing the zero eigenvalues. Yang [11] exploited the PCA plus LDA technique adopted in Fisherface [20] to deal with the problem. Unfortunately, all of these methods discard the discriminant information contained in the null space of the within-class covariance matrix, yet this discriminant information... |
1572 | Nonlinear Component Analysis as a Kernel Eigenvalue Problem - Schölkopf, Smola, et al. - 1998
Citation Context: ... kernel Fisher discriminant analysis (KFD), have aroused considerable interest in the fields of pattern recognition and machine learning [2]. KPCA was originally developed by Schölkopf et al. in 1998 [3], while KFD was first proposed by Mika et al. in 1999 [4], [5]. Subsequent research saw the development of a series of KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8... |
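The KPCA step referenced in this context amounts to eigendecomposing a centered M × M kernel matrix, in the spirit of Schölkopf et al. [3]. A minimal NumPy sketch, assuming a Gaussian RBF kernel and an illustrative `gamma` (not the paper's code):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kpca(X, n_components=2, gamma=1.0):
    """Kernel PCA sketch: center the M x M kernel matrix in feature space,
    eigendecompose it, and project the training samples."""
    M = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    one = np.ones((M, M)) / M
    Kc = K - one @ K - K @ one + one @ K @ one       # double centering
    w, V = np.linalg.eigh(Kc)                        # eigenvalues ascending
    idx = np.argsort(w)[::-1][:n_components]         # keep the largest ones
    w, V = w[idx], V[:, idx]
    alphas = V / np.sqrt(np.maximum(w, 1e-12))       # unit-norm feature-space axes
    return Kc @ alphas                               # projections of training samples
```

The M × M size of `Kc` is exactly why the later contexts call the eigenproblem computationally intensive for large sample sizes.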
1361 | Solutions of Ill-posed Problems - Tikhonov, Arsenin - 1977
Citation Context: ...[12], [13], [14], [15], [16], [17], KFD has been found to be very effective in many real-world applications. KFD, however, always encounters the ill-posed problem in its real-world applications [10], [18]. A number of regularization techniques that might alleviate this problem have been ... J. Yang, J.-y. Yang, and Z. Jin are with the Department of Computer Science, Nanjing University of Science and Tec... |
1116 | The FERET evaluation methodology for face-recognition algorithms. - Phillips, Moon, et al. - 2000 |
596 | An Introduction to Kernel-Based Learning Algorithms - Müller, Mika, et al. - 2001
Citation Context: ...nes (SVMs) [1], kernel principal component analysis (KPCA), and kernel Fisher discriminant analysis (KFD), have aroused considerable interest in the fields of pattern recognition and machine learning [2]. KPCA was originally developed by Schölkopf et al. in 1998 [3], while KFD was first proposed by Mika et al. in 1999 [4], [5]. Subsequent research saw the development of a series of KFD algorithms (se... |
577 | The Theory of Matrices - Lancaster, Tismenetsky - 1985
Citation Context: ...J_b(ξ) = ξ^T S̃_b ξ (||ξ|| = 1), (17) where S̃_b = P^T S_b P and S̃_w = P^T S_w P. It is easy to show that S̃_b and S̃_w are both m × m semipositive definite matrices. This means that J(ξ) is a generalized Rayleigh quotient [34] and J_b(ξ) is a Rayleigh quotient in the isomorphic space IR^m. Note that J_b(ξ) is viewed as a Rayleigh quotient because the formula J_b(ξ) = ξ^T S̃_b ξ (||ξ|| = 1) is equivalent to ξ^T S̃_b ξ / ξ^T ξ [34]. Under the isomorphic... |
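Maximizing the generalized Rayleigh quotient mentioned here reduces to the generalized eigenproblem S̃_b ξ = λ S̃_w ξ, whose top eigenvector attains the maximum. A small SciPy sketch with random stand-in scatter matrices (hypothetical data, not the paper's):

```python
import numpy as np
from scipy.linalg import eigh

# J(x) = (x^T Sb x) / (x^T Sw x) is maximized by the eigenvector of
# Sb x = lambda Sw x with the largest eigenvalue. Sb, Sw below are random
# symmetric positive semidefinite stand-ins for scatter matrices.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); Sb = A @ A.T
B = rng.standard_normal((5, 5)); Sw = B @ B.T + 5 * np.eye(5)  # keep Sw well conditioned

w, V = eigh(Sb, Sw)            # generalized symmetric eigenproblem, ascending eigenvalues
x = V[:, -1]                   # eigenvector of the largest eigenvalue
J = (x @ Sb @ x) / (x @ Sw @ x)
assert np.isclose(J, w[-1])    # the quotient's maximum equals the top eigenvalue
```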
508 | Using discriminant eigenfeatures for image retrieval - Swets, Weng - 1996
Citation Context: ...d more effective KFD algorithms to deal with them. Fisher linear discriminant analysis has been well studied and widely applied to SSS problems in recent years. Many LDA algorithms have been proposed [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29]. The most famous method is Fisherface [19], [20], which is based on a two-phase framework: PCA plus LDA. The effectiveness of this framewor... |
503 | Fisher discriminant analysis with kernels - Mika, Rätsch, et al. - 1999 |
449 | Introductory Functional Analysis with Applications - Kreyszig - 1990
Citation Context: ...ectors. In this section, we will offer our idea of calculating Fisher optimal discriminant vectors in the reduced search space Φ_t. Since the dimension of Φ_t is m, according to functional analysis theory [47], Φ_t is isomorphic to m-dimensional Euclidean space IR^m. The corresponding isomorphic mapping is φ = Pξ, where P = (β_1, β_2, ..., β_m), ξ ∈ IR^m, (15) which is a one-to-one mapping from IR^m onto Φ_t. Under th... |
336 | Generalized discriminant analysis using a kernel approach - Baudat, Anouar - 2000
Citation Context: ... system 10 times and obtain 10 different training and testing sample sets for performance evaluation. Here, the polynomial kernel and Gaussian RBF kernel are both involved. The standard LDA [35], GDA [6], and three versions of CKFD (regular, irregular, and fusion) are tested and evaluated. A minimum distance classifier is employed for computational efficiency. The model selection process is performed... |
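The "minimum distance classifier" used for efficiency in this experiment is, presumably, a nearest-class-mean rule; a minimal sketch with hypothetical labeled feature vectors:

```python
import numpy as np

def fit_class_means(X, y):
    # One mean vector per class label found in y
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def predict_min_distance(X, classes, means):
    # Assign each sample to the class whose mean is nearest in Euclidean distance
    d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]
```

Applied after a discriminant projection such as CKFD, the per-sample cost is just one distance per class, which is why it is cheap compared to the feature-extraction step.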
297 | A direct LDA algorithm for high-dimensional data with application to face recognition - Yu, Yang - 2001
Citation Context: ...d the discriminant information contained in the null space of the within-class covariance matrix, yet this discriminant information is very effective for the “small sample size” (SSS) problem [21], [22], [23], [24], [25]. Lu et al. [12] have taken this issue into account and presented kernel direct discriminant analysis (KDDA) by generalization of the direct-LDA [23]. In real-world applications, particula... |
252 | Discriminant Analysis of Principal Components for Face Recognition - Zhao, Chellappa, Krishnaswamy - 1998
Citation Context: ...near discriminant analysis has been well studied and widely applied to SSS problems in recent years. Many LDA algorithms have been proposed [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29]. The most famous method is Fisherface [19], [20], which is based on a two-phase framework: PCA plus LDA. The effectiveness of this framework in image recognition has been broadly demonstrated [19], [... |
245 | Linear Operators in Hilbert Spaces - Weidmann - 1980
Citation Context: ...suppose its orthogonal complementary space is denoted by Φ_t^⊥. Actually, Φ_t^⊥ is the null space of S_t. Since Φ_t, due to its finite dimensionality, is a closed subspace of H, from the Projection theorem [50], we have Corollary 1: H = Φ_t ⊕ Φ_t^⊥. That is, an arbitrary vector φ ∈ H can be uniquely represented in the form φ = ψ + ζ with ψ ∈ Φ_t and ζ ∈ Φ_t^⊥. Now, let us define a mapping L : H → Φ_t by φ = ψ + ζ ↦ ψ, (13) wh... |
238 | A new LDA-based face recognition system which can solve the small sample size problem - Chen, Liao, et al. - 2001
Citation Context: ...ARY 2005 Fig. 2. Images of one person in the FERET database. (a) Original images. (b) Cropped images (after histogram equalization) corresponding to images in (a). [12], [13], [14], [15], [16], [17], [22], [23], which can yield only one discriminant subspace containing at most c − 1 discriminant features. What is more, CKFD provides a new mechanism for decision fusion. This mechanism makes it possible t... |
190 | Improving the accuracy and speed of support vector learning machines - Burges, Schölkopf - 1997
Citation Context: ...eigenproblem (or generalized eigenproblem). When the sample size M is fairly large, it becomes very computationally intensive [10]. Several ways suggested by Mika et al. [10] and Burges and Schölkopf [45] can be used to deal with this problem, but the optimal implementation scheme (e.g., developing a more efficient numerical algorithm for large-scale eigenproblems) is still open. APPENDIX A THE PROOF ... |
186 | Kernel Eigenfaces vs. Kernel Fisherfaces: face recognition using kernel methods - Yang - 2002
Citation Context: ...st proposed by Mika et al. in 1999 [4], [5]. Subsequent research saw the development of a series of KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billings and Lee [14], Gestel et al. [15], Cawley and Talbot [16], and Lawrence and Schölkopf [17]). The KFD algorithms developed by Mika et al. are formulated for tw... |
143 | Face recognition using kernel direct discriminant analysis algorithms - Lu, Plataniotis, et al. - 2003
Citation Context: ...INTELLIGENCE, VOL. 27, NO. 2, FEBRUARY 2005 Fig. 2. Images of one person in the FERET database. (a) Original images. (b) Cropped images (after histogram equalization) corresponding to images in (a). [12], [13], [14], [15], [16], [17], [22], [23], which can yield only one discriminant subspace containing at most c − 1 discriminant features. What is more, CKFD provides a new mechanism for decision fusion... |
136 | Incremental linear discriminant analysis for face recognition - Zhao, Yuen - 2008
Citation Context: ...her linear discriminant analysis has been well studied and widely applied to SSS problems in recent years. Many LDA algorithms have been proposed [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29]. The most famous method is Fisherface [19], [20], which is based on a two-phase framework: PCA plus LDA. The effectiveness of this framework in image recognition has been broadly demonstrated [... |
99 | Nonlinear discriminant analysis using kernel functions - Roth, Steinhage - 2000
Citation Context: ... et al. in 1998 [3], while KFD was first proposed by Mika et al. in 1999 [4], [5]. Subsequent research saw the development of a series of KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billings and Lee [14], Gestel et al. [15], Cawley and Talbot [16], and Lawrence and Schölkopf [17]). The KFD algorithms develop... |
98 | Analyzing PCA-based Face Recognition Algorithms: Eigenvector Selection and Distance Measures - Yambor, Draper, et al. - 2002
Citation Context: ...isherface. Is CKFD statistically significantly better than other methods in terms of its recognition rate? To answer this question, let us evaluate the experimental results in Table 3 using McNemar’s [39], [40], [41] significance test. McNemar’s test is essentially a null hypothesis statistical test based on a Bernoulli model. If the resulting p-value is below the desired significance level (for examp... |
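McNemar's test, as described in this context, compares two classifiers through their discordant error counts, which under the null hypothesis follow a Binomial(n, 0.5) model. A sketch of the exact two-sided form (assumed variant; the counts are illustrative, not from the paper's Table 3):

```python
from math import comb

def mcnemar_exact_p(n01, n10):
    """Exact two-sided McNemar p-value from the discordant pair counts:
    n01 = samples method A got wrong but B got right, n10 = the reverse.
    Under the null, min(n01, n10) is a lower tail of Binomial(n, 0.5)."""
    n = n01 + n10
    k = min(n01, n10)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2**n
    return min(1.0, 2 * tail)

# e.g., with 25 vs. 10 discordant errors, a p-value below the chosen
# significance level (say 0.02) would mark the difference as significant
p = mcnemar_exact_p(25, 10)
```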
89 | Applications of Functional Analysis and Operator Theory - Hutson, Pym - 1980
Citation Context: ...tressed that we would not like to lose any effective discriminant information in the process of space reduction. To this end, some theory should be developed first. Theorem 1 (Hilbert-Schmidt Theorem [49]). Let A be a compact and self-adjoint operator on Hilbert space H. Then, its eigenvector system forms an orthonormal basis for H. Since St is compact and self-adjoint, it follows from Theorem 1 that ... |
73 | Constructing descriptive and discriminative nonlinear features: Rayleigh coefficients in kernel feature spaces - Mika, Rätsch, et al. - 2003
Citation Context: ...because all kernel-based discriminant methods have to solve an M × M sized eigenproblem (or generalized eigenproblem). When the sample size M is fairly large, it becomes very computationally intensive [10]. Several ways suggested by Mika et al. [10] and Burges and Schölkopf [45] can be used to deal with this problem, but the optimal implementation scheme (e.g., developing a more efficient numerical alg... |
70 | A mathematical programming approach to the Kernel Fisher algorithm - Mika, Rätsch, et al. - 2001
Citation Context: ...decision by modifying the fusion coefficient. CKFD has a computational complexity of O(M³) (M is the number of training samples), which is the same as the existing KFD algorithms [4], [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17]. The reason for this is that the KPCA phase of CKFD is actually carried out in the space spanned by M training samples, so its computational compl... |
63 | Recognizing faces with ... - Draper, Baek, et al. - 2003
Citation Context: ...s CKFD statistically significantly better than other methods in terms of its recognition rate? To answer this question, let us evaluate the experimental results in Table 3 using McNemar’s [39], [40], [41] significance test. McNemar’s test is essentially a null hypothesis statistical test based on a Bernoulli model. If the resulting p-value is below the desired significance level (for example, 0.02), t... |
55 | Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers - Cawley, Talbot - 2003
Citation Context: ... KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billings and Lee [14], Gestel et al. [15], Cawley and Talbot [16], and Lawrence and Schölkopf [17]). The KFD algorithms developed by Mika et al. are formulated for two classes, while those of Baudat and Anouar are formulated for multiple classes. Because of its abi... |
52 | Robust coding schemes for indexing and retrieval from large face databases - Liu, Wechsler
Citation Context: ...m. Fisher linear discriminant analysis has been well studied and widely applied to SSS problems in recent years. Many LDA algorithms have been proposed [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29]. The most famous method is Fisherface [19], [20], which is based on a two-phase framework: PCA plus LDA. The effectiveness of this framework in image recognition has been broadly demonstr... |
47 | Application of the KL Procedure for the Characterization of Human Faces. - Kirby, Sirovich - 1990 |
42 | A shape- and texture-based enhanced Fisher classifier for face recognition - Liu, Wechsler - 2001
Citation Context: ...th them. Fisher linear discriminant analysis has been well studied and widely applied to SSS problems in recent years. Many LDA algorithms have been proposed [19], [20], [21], [22], [23], [24], [25], [26], [27], [28], [29]. The most famous method is Fisherface [19], [20], which is based on a two-phase framework: PCA plus LDA. The effectiveness of this framework in image recognition has been broadly de... |
41 | An improved training algorithm for kernel Fisher discriminants - Mika, Smola, et al. - 2001
Citation Context: ...hile KFD was first proposed by Mika et al. in 1999 [4], [5]. Subsequent research saw the development of a series of KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billings and Lee [14], Gestel et al. [15], Cawley and Talbot [16], and Lawrence and Schölkopf [17]). The KFD algorithms developed by Mika et al. are ... |
27 | Nonlinear Fisher discriminant analysis using a minimum squared error cost function and the orthogonal least squares algorithm - Billings, Lee - 2002
Citation Context: ... research saw the development of a series of KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billings and Lee [14], Gestel et al. [15], Cawley and Talbot [16], and Lawrence and Schölkopf [17]). The KFD algorithms developed by Mika et al. are formulated for two classes, while those of Baudat and Anouar are formula... |
23 | Kernel MSE algorithm: a unified framework for KFD, LS-SVM and KRR - Xu, Zhang, et al. - 2001
Citation Context: ...99 [4], [5]. Subsequent research saw the development of a series of KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billings and Lee [14], Gestel et al. [15], Cawley and Talbot [16], and Lawrence and Schölkopf [17]). The KFD algorithms developed by Mika et al. are formulated for two classes, while those of Baudat... |
23 | An Efficient Algorithm for Foley-Sammon Optimal Set of Discriminant Vectors by Algebraic Method - Liu, Cheng, et al. - 1992
Citation Context: ...thods discard the discriminant information contained in the null space of the within-class covariance matrix, yet this discriminant information is very effective for the “small sample size” (SSS) problem [21], [22], [23], [24], [25]. Lu et al. [12] have taken this issue into account and presented kernel direct discriminant analysis (KDDA) by generalization of the direct-LDA [23]. In real-world application... |
21 | Bayesian framework for least squares support vector machine classifiers, Gaussian processes, and kernel Fisher discriminant analysis - Van Gestel, Suykens, et al. - 2002
Citation Context: ...velopment of a series of KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billings and Lee [14], Gestel et al. [15], Cawley and Talbot [16], and Lawrence and Schölkopf [17]). The KFD algorithms developed by Mika et al. are formulated for two classes, while those of Baudat and Anouar are formulated for multiple cla... |
20 | Feature fusion: parallel strategy vs. serial strategy - Yang, Yang, et al. - 2003
Citation Context: ... the ratio curve levels off. 5.2 Experiment on Handwritten Digit Classification Using CENPARMI Database In this experiment, we use the Concordia University CENPARMI handwritten numeral database [42], [44]. This database contains 6,000 samples of 10 numeral classes (each class has 600 samples). Here, our experiment is performed based on 256-dimensional Gabor transformation features [43], [44], which tu... |
18 | The Facial Recognition Technology (FERET) Database - Phillips - 2004
Citation Context: ...riment on Face Recognition Using the FERET Database The FERET face image database is a result of the FERET program, which was sponsored by the US Department of Defense through the DARPA Program [36], [37]. It has become a standard database for testing and evaluating state-of-the-art face recognition algorithms. The proposed algorithm was tested on a subset of the FERET database. This subset includes 1... |
11 | Invariant Feature Extraction and Classification - Mika, Rätsch, et al. - 2000
Citation Context: ...derable interest in the fields of pattern recognition and machine learning [2]. KPCA was originally developed by Schölkopf et al. in 1998 [3], while KFD was first proposed by Mika et al. in 1999 [4], [5]. Subsequent research saw the development of a series of KFD algorithms (see Baudat and Anouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billi... |
11 | Statistics: The Exploration and Analysis of Data (third edition) - Devore, Peck - 1997
Citation Context: ...ace. Is CKFD statistically significantly better than other methods in terms of its recognition rate? To answer this question, let us evaluate the experimental results in Table 3 using McNemar’s [39], [40], [41] significance test. McNemar’s test is essentially a null hypothesis statistical test based on a Bernoulli model. If the resulting p-value is below the desired significance level (for example, 0.... |
10 | Optimal FLD algorithm for facial feature extraction - Yang, Yang - 2001
Citation Context: ...minant information contained in the null space of the within-class covariance matrix, yet this discriminant information is very effective for the “small sample size” (SSS) problem [21], [22], [23], [24], [25]. Lu et al. [12] have taken this issue into account and presented kernel direct discriminant analysis (KDDA) by generalization of the direct-LDA [23]. In real-world applications, particularly in image... |
10 | Fuzzy kernel Fisher discriminant algorithm with application to face recognition - Zheng, Yang, et al. - 2006
Citation Context: ...than Chen and Yu’s methods, which can extract at most c − 1 features. In addition, our LDA algorithm [24] is more powerful and simpler than Liu et al.’s [21] method [52]. The algorithm in the literature [32] can be viewed as a nonlinear generalization of that in [24]. However, the derivation of the algorithm is based on the assumption that the feature space is a finite-dimensional space. Thi... |
7 | Combined Fisherfaces framework - Yang, Yang, et al.
Citation Context: ...ncludes 1,400 images of 200 individuals (each individual has seven images). It is composed of the images whose names are marked with two-character strings: “ba,” “bj,” “bk,” “be,” “bf,” “bd,” and “bg” [51]. This subset involves variations in facial expression, illumination, and pose. In our experiment, the facial portion of each original image was automatically cropped based on the location of eyes and... |
4 | A generalized K-L expansion method which can deal with small sample size and high-dimensional problems - Yang, Zhang, et al. - 2003 |
3 | Estimating a Kernel Fisher Discriminant - Lawrence, Schölkopf - 2001
Citation Context: ...nouar [6], Roth and Steinhage [7], Mika et al. [8], [9], [10], Yang [11], Lu et al. [12], Xu et al. [13], Billings and Lee [14], Gestel et al. [15], Cawley and Talbot [16], and Lawrence and Schölkopf [17]). The KFD algorithms developed by Mika et al. are formulated for two classes, while those of Baudat and Anouar are formulated for multiple classes. Because of its ability to extract the most discrimi... |
3 | Recognition of Handwritten Numerals Using Gabor Features - Hamamoto, Uchimura, et al. - 1996
Citation Context: ... database [42], [44]. This database contains 6,000 samples of 10 numeral classes (each class has 600 samples). Here, our experiment is performed based on 256-dimensional Gabor transformation features [43], [44], which turned out to be effective for handwritten digit classification. In our experiments, 100 samples are randomly chosen from each class for training, while the remaining 500 samples are used... |
2 | Rejection Criteria and Pairwise Discrimination of Handwritten Numerals - Lou, Liu, et al. - 1992
Citation Context: ...an 2), the ratio curve levels off. 5.2 Experiment on Handwritten Digit Classification Using CENPARMI Database In this experiment, we use the Concordia University CENPARMI handwritten numeral database [42], [44]. This database contains 6,000 samples of 10 numeral classes (each class has 600 samples). Here, our experiment is performed based on 256-dimensional Gabor transformation features [43], [44], wh... |
2 | Theory of Fisher linear discriminant analysis and its application - Yang, Yang, et al.
Citation Context: ...ithm turned out to be more effective than Chen and Yu’s methods, which can extract at most c − 1 features. In addition, our LDA algorithm [24] is more powerful and simpler than Liu et al.’s [21] method [52]. The algorithm in the literature [32] can be viewed as a nonlinear generalization of that in [24]. However, the derivation of the algorithm is based on an assumption that the feature space is assumed... |