## Classification via semi-Riemannian spaces (2008)

### Download Links

- [research.microsoft.com]
- [mmlab.ie.cuhk.edu.hk]
- [mplab.ucsd.edu]
- DBLP

Venue: Proc. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)

Citations: 3 (2 self)

### BibTeX

@INPROCEEDINGS{Zhao08classificationvia,
  author    = {Deli Zhao and Zhouchen Lin and Xiaoou Tang},
  title     = {Classification via semi-Riemannian spaces},
  booktitle = {Proc. IEEE Conf. on Computer Vision and Pattern Recognition},
  year      = {2008}
}

### Abstract

In this paper, we develop a geometric framework for linear or nonlinear discriminant subspace learning and classification. In our framework, the structures of classes are conceptualized as a semi-Riemannian manifold, regarded as a submanifold embedded in an ambient semi-Riemannian space. The class structures of original samples can be characterized and deformed by local metrics of the semi-Riemannian space. Semi-Riemannian metrics are uniquely determined by the smoothing of discrete functions and the nullity of the semi-Riemannian space. Based on the geometrization of class structures, optimizing class structures in the feature space is equivalent to maximizing the quadratic quantities of metric tensors in the semi-Riemannian space. Thus supervised discriminant subspace learning reduces to unsupervised semi-Riemannian manifold learning. Based on the proposed framework, a novel algorithm, dubbed Semi-Riemannian Discriminant Analysis (SRDA), is presented for subspace-based classification. The performance of SRDA is tested on face recognition (singular case) and handwritten capital letter classification (nonsingular case) against existing algorithms. The experimental results show that SRDA works well on recognition and classification, implying that semi-Riemannian geometry is a promising new tool for pattern recognition and machine learning.
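The core geometric object the abstract appeals to, a semi-Riemannian metric, can be illustrated numerically. The sketch below is not the paper's SRDA algorithm; the function names and weight values are illustrative. It shows the one property that distinguishes a semi-Riemannian metric from a Riemannian one: the quadratic form can be positive, negative, or exactly zero.

```python
import numpy as np

# A semi-Riemannian metric on R^(p+nu) is given by a diagonal metric matrix
# with p positive and nu negative entries.  Unlike a Riemannian metric, the
# quadratic form dy^T G dy can be positive, negative, or exactly zero
# (a "null" direction) -- the nullity the abstract refers to.

def semi_riemannian_metric(pos_weights, neg_weights):
    """Diagonal metric G = diag(pos_weights, -neg_weights); weights > 0."""
    return np.diag(np.concatenate([pos_weights, -np.asarray(neg_weights)]))

def quadratic_form(G, dy):
    """g(dy, dy) = dy^T G dy under the metric G."""
    return float(dy @ G @ dy)

G = semi_riemannian_metric(np.array([1.0, 2.0]), np.array([3.0]))
dy = np.array([1.0, 1.0, 1.0])
print(quadratic_form(G, dy))  # 1 + 2 - 3 = 0.0: dy is a null vector under G
```

In a discriminant-learning reading, the positive block would weight between-class spread and the negative block within-class spread, so maximizing the quadratic form pushes the two apart; that interpretation is an assumption here, not a quote from the paper.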

### Citations

5327 |
Matrix Analysis
- Horn, Johnson
- 1985
(Show Context)
Citation Context ... is the feasible solution of the metric GN i that is favorable of discrimination. In this paper, we assume that the feature space is Euclidean, meaning that the length of y is measured by the ℓ2 norm =-=[12]-=-: ‖y‖2 ℓ2 = yTy = tr(yyT ). 3.2.1 Alignment of Metric Tensors in SemiRiemannian Space Suppose that the metric matrix G N i at xi has already been determined. If we penalize the feature space S d y usi... |
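The ℓ2 identity quoted in the excerpt is easy to verify numerically; this is a quick check, not code from the paper:

```python
import numpy as np

# Identity from the excerpt: ‖y‖²_{ℓ2} = yᵀy = tr(y yᵀ),
# i.e. the squared norm equals the trace of the rank-one outer product.
y = np.array([3.0, 4.0])
squared_norm = y @ y                    # yᵀy
trace_form = np.trace(np.outer(y, y))   # tr(y yᵀ)
print(squared_norm, trace_form)  # 25.0 25.0
```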

2415 | The Elements of Statistical Learning — Hastie, Tibshirani, et al., 2001
Citation Context: ...‖x̄_j − x_i‖_{S_x^n}, j = 1, . . . , c. (4) The distance ‖x̄_j − x_i‖_{S_x^n} depends on the attributes of the sample space S_x^n. It may be the Euclidean distance, one of the statistical distances like the Chi-square [10], or the approximated geodesic distance [23]. It suffices to emphasize that the original motivation of the definition of KNN classes comes from the surprising effectiveness of discriminant subspaces l...

1853 | A global geometric framework for nonlinear dimensionality reduction — Tenenbaum, et al.
Citation Context: ...paid to investigating class structures since Fisher’s LDA. Most works on subspace-based classification can be traced back to LDA and the Fisher criterion. Recently, the development of manifold learning [23, 22] has drawn researchers’ attention to the investigation of local structures of data in the pattern recognition community. Such analysis is necessary in cases where data structures are complex. Line...

1787 | Nonlinear dimensionality reduction by locally linear embedding, Science 290(5500) — Roweis, Saul, 2000
Citation Context: ...paid to investigating class structures since Fisher’s LDA. Most works on subspace-based classification can be traced back to LDA and the Fisher criterion. Recently, the development of manifold learning [23, 22] has drawn researchers’ attention to the investigation of local structures of data in the pattern recognition community. Such analysis is necessary in cases where data structures are complex. Line...

1725 | Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection — Belhumeur, Hespanha, et al., 1997
Citation Context: ...e types of approaches: 1) the regularization of the within-class covariance matrix such as the work in [8, 9], 2) Principal Component Analysis (PCA) based dimensionality reduction such as Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decomposition-based approaches like [13, 33] and the correlation-based methods such as [17]. However, ...

348 | Regularized discriminant analysis — Friedman, 1989
Citation Context: ...ination has been devoted to tackling the singularity problem. Overall, there are mainly three types of approaches: 1) the regularization of the within-class covariance matrix such as the work in [8, 9], 2) Principal Component Analysis (PCA) based dimensionality reduction such as Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decompos...

344 | Semi-Riemannian Geometry. With Applications to Relativity — O’Neill, 1983
Citation Context: ...rs are fond of due to its simplicity, principled treatment, and comparable performance. We devote this paper to addressing the linear classification issue from the perspective of semi-Riemannian geometry [18]. ∗ The work was performed when Deli Zhao worked at Microsoft Research Asia. 1.1. Fisher Criterion and Discrepancy Criterion. Fisher’s Linear Discriminant Analysis (LDA) [7] is well known as the classi...

231 | Face recognition using laplacianfaces — He, Yan, et al., 2005
Citation Context: ...ern recognition community. Such analysis is necessary in cases where data structures are complex. Linear methods related to manifold learning have been proposed for subspace-based recognition [11, 31]. Another recent development in linear discrimination is that discrepancy criteria took the role of integrating (global or local) between-class scatters and (global or local) within-class scatters i...

209 | A direct LDA algorithm for high-dimensional data with application to face recognition, Pattern Recognition 34 — Yu, Yang, 2001
Citation Context: ...within-class covariance matrix such as the work in [8, 9], 2) Principal Component Analysis (PCA) based dimensionality reduction such as Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decomposition-based approaches like [13, 33] and the correlation-based methods such as [17]. However, less attention has been paid to investigating class structures since Fishe...

178 | A new LDA-based face recognition system which can solve the small sample size problem — Chen, Liao, et al., 2000
Citation Context: ...within-class covariance matrix such as the work in [8, 9], 2) Principal Component Analysis (PCA) based dimensionality reduction such as Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decomposition-based approaches like [13, 33] and the correlation-based methods such as [17]. However, less attention has been paid to investigating class structures since Fishe...

156 | Penalized discriminant analysis — Hastie, Tibshirani, 1995
Citation Context: ...ination has been devoted to tackling the singularity problem. Overall, there are mainly three types of approaches: 1) the regularization of the within-class covariance matrix such as the work in [8, 9], 2) Principal Component Analysis (PCA) based dimensionality reduction such as Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decompos...

109 | Graph embedding and extensions: a general framework for dimensionality reduction — Yan, Xu, et al., 2007
Citation Context: ...ern recognition community. Such analysis is necessary in cases where data structures are complex. Linear methods related to manifold learning have been proposed for subspace-based recognition [11, 31]. Another recent development in linear discrimination is that discrepancy criteria took the role of integrating (global or local) between-class scatters and (global or local) within-class scatters i...

102 | The statistical utilization of multiple measurements — Fisher, 1938
Citation Context: ...of semi-Riemannian geometry [18]. ∗ The work was performed when Deli Zhao worked at Microsoft Research Asia. 1.1. Fisher Criterion and Discrepancy Criterion. Fisher’s Linear Discriminant Analysis (LDA) [7] is well known as the classic work on discriminant analysis. Fisher performed the structural analysis of classes by maximizing the between-class scatter and simultaneously minimizing the within-class ...

80 | KPCA Plus LDA: A Complete Kernel Fisher Discriminant Framework for Feature Extraction and Recognition — Yang, Frangi, et al., 2005
Citation Context: ...within-class covariance matrix such as the work in [8, 9], 2) Principal Component Analysis (PCA) based dimensionality reduction such as Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decomposition-based approaches like [13, 33] and the correlation-based methods such as [17]. However, less attention has been paid to investigating class structures since Fishe...

69 | A unified framework for subspace face recognition — Wang, Tang, 2004

57 | Generalizing Discriminant Analysis Using the Generalized Singular Value Decomposition — Howland, Park, 2004
Citation Context: ...lysis (PCA) based dimensionality reduction such as Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decomposition-based approaches like [13, 33] and the correlation-based methods such as [17]. However, less attention has been paid to investigating class structures since Fisher’s LDA. Most works on subspace-based classification can be traced ...

56 | Dual-space linear discriminant analysis for face recognition, CVPR — Wang, Tang, 2004

51 | Efficient and Robust Feature Extraction by Maximum Margin Criterion — Li, Zhang, 2004
Citation Context: ...ing (global or local) between-class scatters and (global or local) within-class scatters instead of ratios as in the traditional Fisher criterion. Global methods include Maximum Margin Criterion (MMC) [14] and Kernel Scatter-Difference Analysis (KSDA) [15, 16], and local ones include Stepwise Nonparametric Maximum Margin Criterion (SNMMC) [21], Local and Weighted ...

39 | Lightlike Submanifolds of Semi-Riemannian Manifolds and Applications — Duggal, Bejancu, 1996
Citation Context: ...es in physics. To the best of our knowledge, however, it has not been explicitly applied in pattern recognition before. Here we give a concise introduction to semi-Riemannian spaces. One may refer to [18, 6] for more details. Geometric spaces are specified by their metrics. The metric matrix in the semi-Riemannian space N^n_ν is of the form G = [Λ̌_{p×p} 0; 0 −Λ̂_{ν×ν}] (1), where Λ̌_{p×p} and Λ̂_{ν×ν} are diagonal ...

37 | Where Are Linear Feature Extraction Methods Applicable — Martínez, Zhu, 2005
Citation Context: ...Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decomposition-based approaches like [13, 33] and the correlation-based methods such as [17]. However, less attention has been paid to investigating class structures since Fisher’s LDA. Most works on subspace-based classification can be traced back to LDA and the Fisher criterion. Recently, the...

37 | United Subspace Analysis for Face Recognition — Wang, Tang, 2003
Citation Context: ...of the definition of KNN classes comes from the surprising effectiveness of discriminant subspaces learnt only from several nearest-neighbor classes of a query sample in some resulting feature spaces [27] 2 . Readers may refer to [27] for more details. 3.1. Modeling Class Structures as a Semi-Riemannian Submanifold. 3.1.1 Associating Class Structures with a Semi-Riemannian Manifold. First, let us introduc...

34 | Random Sampling for Subspace Face Recognition — Wang, Tang, 2006

30 | Spectral regression for efficient regularized subspace learning — Cai, He, et al.
Citation Context: ...d-column eigenvectors of L corresponding to the first d largest eigenvalues. This type of nonlinear embedding can be exploited for class visualization and the efficient computation of linear subspaces [4]. If there is a linear isometric transformation between the low-dimensional feature vector y and the original sample x, i.e., y ↦→ Uy = x, where UᵀU = I_{d×d}, then the linear discriminant subspace U can...

29 | A Two-Stage Linear Discriminant Analysis via QR-Decomposition — Ye, Li
Citation Context: ...lysis (PCA) based dimensionality reduction such as Fisherfaces [2], and 3) subspace-based variants of LDA such as [2, 5, 34, 29, 32, 28, 30]. There are also matrix-decomposition-based approaches like [13, 33] and the correlation-based methods such as [17]. However, less attention has been paid to investigating class structures since Fisher’s LDA. Most works on subspace-based classification can be traced ...

27 | Learning a Spatially Smooth Subspace for Face Recognition, CVPR — Cai, He, et al., 2007
Citation Context: ...re F̌ is the first-order difference operator F̌ = [I_{(KǨ−1)×(KǨ−1)} 0_{(KǨ−1)×1}] + [0_{(KǨ−1)×1} −I_{(KǨ−1)×(KǨ−1)}] (11)–(12). Note that F̌ᵀF̌ is the Neumann discretization of the Laplacian [19, 3]. B. Setting N^{KǨ+K̂}_{K̂} Locally Null. Null (or light-like) manifolds are typical examples in semi-Riemannian spaces [6]. In classification, a null manifold has its physical nature in its own righ...
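The first-order difference operator described in the excerpt can be built directly. The sketch below uses a small illustrative size K rather than the paper's KǨ−1 dimensions, and confirms that FᵀF is the Neumann (tridiagonal) discretization of the 1-D Laplacian:

```python
import numpy as np

def first_order_difference(K):
    """F = [I 0] + [0 -I], shape (K-1, K), so (F x)[i] = x[i] - x[i+1]."""
    return (np.hstack([np.eye(K - 1), np.zeros((K - 1, 1))])
            + np.hstack([np.zeros((K - 1, 1)), -np.eye(K - 1)]))

F = first_order_difference(4)
L = F.T @ F  # Neumann discretization of the 1-D Laplacian
print(L)
# [[ 1. -1.  0.  0.]
#  [-1.  2. -1.  0.]
#  [ 0. -1.  2. -1.]
#  [ 0.  0. -1.  1.]]
```

The Neumann (zero-flux) boundary shows up in the corner entries being 1 rather than 2; constant vectors are annihilated (F maps them to zero), which is exactly the smoothing behavior a difference penalty wants.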

21 | Trace ratio vs. ratio trace for dimensionality reduction — Wang, Yan, et al.
Citation Context: ...(SNMMC) [21], Local and Weighted Maximum Margin Discriminant Analysis (LWMMDA) [26], and Average Neighborhood Margin Maximization (ANMM) [24]. A discrepancy criterion is also implicitly contained in [25]. Such methods successfully avoid the generalized eigen-decomposition problem and are thereby free from the computational dilemma of singularity. 1.2. Our Work. 1.2.1 From Data to Semi-Riemannian...

14 | Discretized Laplacian smoothing by Fourier methods — O’Sullivan, 1991
Citation Context: ...re F̌ is the first-order difference operator F̌ = [I_{(KǨ−1)×(KǨ−1)} 0_{(KǨ−1)×1}] + [0_{(KǨ−1)×1} −I_{(KǨ−1)×(KǨ−1)}] (11)–(12). Note that F̌ᵀF̌ is the Neumann discretization of the Laplacian [19, 3]. B. Setting N^{KǨ+K̂}_{K̂} Locally Null. Null (or light-like) manifolds are typical examples in semi-Riemannian spaces [6]. In classification, a null manifold has its physical nature in its own righ...

11 | Overview of the Face Recognition Grand Challenge — Phillips, Flynn, et al., 2005
Citation Context: ...yed on extracted features for recognition and classification. 4.1. Singular Case: Face Recognition. We perform the experiments on a subset selected from the query set of experiment 4 in FRGC version 2 [20]. The facial data set was used in [36]. There are 200 subjects in the gallery and probe set and 116 subjects in the training set. There are ten facial images for each subject. The identities of subjec...

11 | Feature extraction by maximizing the average neighborhood margin — Wang, Zhang
Citation Context: ...parametric Maximum Margin Criterion (SNMMC) [21], Local and Weighted Maximum Margin Discriminant Analysis (LWMMDA) [26], and Average Neighborhood Margin Maximization (ANMM) [24]. A discrepancy criterion is also implicitly contained in [25]. Such methods successfully avoid the generalized eigen-decomposition problem and are thereby free from the computational dilemma of...

11 | Linear Laplacian Discrimination for Feature Extraction — Zhao, Liu, et al., 2007
Citation Context: ...ion and classification. 4.1. Singular Case: Face Recognition. We perform the experiments on a subset selected from the query set of experiment 4 in FRGC version 2 [20]. The facial data set was used in [36]. There are 200 subjects in the gallery and probe set and 116 subjects in the training set. There are ten facial images for each subject. The identities of subjects in the training set are different fr...

10 | Face recognition using kernel scatter-difference-based discriminant analysis — Liu, Tang, et al., 2006
Citation Context: ...(global or local) within-class scatters instead of ratios as in the traditional Fisher criterion. Global methods include Maximum Margin Criterion (MMC) [14] and Kernel Scatter-Difference Analysis (KSDA) [15, 16], and local ones include Stepwise Nonparametric Maximum Margin Criterion (SNMMC) [21], Local and Weighted Maximum Margin Discriminant Analysis (LWMMDA) [26], and...

6 | Laplacian PCA and its applications — Zhao, Lin, et al.
Citation Context: ...r g(dy_i, dy_i). The maximization of g(dy_i, dy_i) is in effect principal component analysis in N^{KǨ+K̂}_{K̂}. We may handle the maximization by taking advantage of Zhao et al.’s theoretical framework [35] on the alignment of local geometry. More specifically, let the difference operator D be D = [I_{(KǨ+K̂)×(KǨ+K̂)}; −eᵀ_{KǨ+K̂}] (8). Then we have the following theorem pertaining to the m...

4 | Kernel scatter-difference based discriminant analysis for face recognition — Liu, Tang, 2004
Citation Context: ...(global or local) within-class scatters instead of ratios as in the traditional Fisher criterion. Global methods include Maximum Margin Criterion (MMC) [14] and Kernel Scatter-Difference Analysis (KSDA) [15, 16], and local ones include Stepwise Nonparametric Maximum Margin Criterion (SNMMC) [21], Local and Weighted Maximum Margin Discriminant Analysis (LWMMDA) [26], and...

4 | Weighted maximum margin discriminant analysis with kernels — Zheng, Zou, et al., 2005
Citation Context: ...[15, 16], and local ones include Stepwise Nonparametric Maximum Margin Criterion (SNMMC) [21], Local and Weighted Maximum Margin Discriminant Analysis (LWMMDA) [26], and Average Neighborhood Margin Maximization (ANMM) [24]. A discrepancy criterion is also implicitly contained in [25]. Such methods successfully avoid the generalized eigen-decomposition p...

3 | Face description with local binary patterns: application to face recognition — Ahonen, Hadid, et al.
Citation Context: ...and the remaining for the probe set. Such a trial is repeated 20 times. We apply the Local Binary Pattern (LBP) algorithm to extract visual features. The usage of LBP here is consistent with that in [1]: pattern (8,2), 59 bins, and 7 × 7 image blocks. For PCA-combined methods, the number of principal components is optimally determined. Besides, for LPP, the number of nearest neighbors is chosen as 3...

1 | Face recognition by stepwise nonparametric margin maximum criterion — Qiu, Wu
Citation Context: ...rgin Criterion (MMC) [14] and Kernel Scatter-Difference Analysis (KSDA) [15, 16], and local ones include Stepwise Nonparametric Maximum Margin Criterion (SNMMC) [21], Local and Weighted Maximum Margin Discriminant Analysis (LWMMDA) [26], and Average Neighborhood Margin Maximization (ANMM) [24]. A discrepancy criterion is also implicitly contained in [25]. Such ki...