Results 1 - 10 of 25
Characterization of a family of algorithms for generalized discriminant analysis on undersampled problems
Journal of Machine Learning Research, 2005
Cited by 50 (11 self)
Abstract:
A generalized discriminant analysis based on a new optimization criterion is presented. The criterion extends the optimization criteria of the classical Linear Discriminant Analysis (LDA) when the scatter matrices are singular. An efficient algorithm for the new optimization problem is presented. The solutions to the proposed criterion form a family of algorithms for generalized LDA, which can be characterized in a closed form. We study two specific algorithms, namely Uncorrelated LDA (ULDA) and Orthogonal LDA (OLDA). ULDA was previously proposed for feature extraction and dimension reduction, whereas OLDA is a novel algorithm proposed in this paper. The features in the reduced space of ULDA are uncorrelated, while the discriminant vectors of OLDA are orthogonal to each other. We have conducted a comparative study on a variety of real-world data sets to evaluate ULDA and OLDA in terms of classification accuracy.
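The OLDA idea summarized above can be made concrete with a minimal numpy sketch: compute generalized discriminant vectors using the pseudo-inverse of the (possibly singular) total scatter, then orthogonalize them with a QR step. This is a simplified stand-in for the paper's closed-form family, not its exact algorithm; the function names are ours.

```python
import numpy as np

def scatter_matrices(X, y):
    # X: (n, d) data matrix, y: (n,) integer class labels.
    d = X.shape[1]
    mean = X.mean(axis=0)
    Sb = np.zeros((d, d))  # between-class scatter
    Sw = np.zeros((d, d))  # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
        Sw += (Xc - mc).T @ (Xc - mc)
    return Sb, Sw

def olda(X, y, k):
    # Generalized LDA step: pinv replaces the inverse that classical
    # LDA would need, so singular scatter matrices are tolerated.
    Sb, Sw = scatter_matrices(X, y)
    St = Sb + Sw                           # total scatter
    vals, vecs = np.linalg.eig(np.linalg.pinv(St) @ Sb)
    order = np.argsort(-np.real(vals))
    G = np.real(vecs[:, order[:k]])        # top-k discriminant vectors
    Q, _ = np.linalg.qr(G)                 # OLDA: orthonormal columns
    return Q
```

On an undersampled problem (d much larger than n) this still runs, and the returned discriminant vectors are orthogonal by construction.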
Orthogonal Laplacianfaces for face recognition
IEEE Trans. Image Process., 2006
Cited by 42 (2 self)
Discriminant Embedding for Local Image Descriptors
Cited by 41 (4 self)
Abstract:
Invariant feature descriptors such as SIFT and GLOH have been demonstrated to be very robust for image matching and visual recognition. However, such descriptors are generally parameterised in very high dimensional spaces, e.g., 128 dimensions in the case of SIFT. This limits the performance of feature matching techniques in terms of speed and scalability. Furthermore, these descriptors have traditionally been carefully hand-crafted by manually tuning many parameters. In this paper, we tackle both of these problems by formulating descriptor design as a nonparametric dimensionality reduction problem. In contrast to previous approaches that use only the global statistics of the inputs, we adopt a discriminative approach. Starting from a large training set of labelled match/nonmatch pairs, we pursue lower dimensional embeddings that are optimised for their discriminative power. Extensive comparative experiments demonstrate that we can exceed the performance of the current state-of-the-art techniques such as SIFT with far fewer dimensions, and with virtually no parameters to be tuned by hand.
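The discriminative recipe in this abstract, learning a projection from labelled match/non-match pairs, can be sketched with a generalized-eigenvector step: keep directions along which non-match pairs spread apart relative to match pairs. This is a simplified stand-in for the paper's embeddings; the function names and the pinv-based solver are ours.

```python
import numpy as np

def pair_scatter(X, pairs):
    # Sum of outer products of descriptor differences over index pairs.
    d = X.shape[1]
    S = np.zeros((d, d))
    for i, j in pairs:
        diff = X[i] - X[j]
        S += np.outer(diff, diff)
    return S

def discriminative_embedding(X, match, non_match, k):
    # Maximize non-match spread relative to match spread:
    # top-k eigenvectors of pinv(S_match) @ S_non_match.
    Sm = pair_scatter(X, match)
    Sn = pair_scatter(X, non_match)
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sm) @ Sn)
    order = np.argsort(-np.real(vals))
    return np.real(vecs[:, order[:k]])     # (d, k) projection matrix
```

Descriptors are then embedded as `X @ W`, trading the original dimensionality (e.g. 128 for SIFT) for a much smaller k.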
Computational and Theoretical Analysis of Null Space and Orthogonal Linear Discriminant Analysis
Journal of Machine Learning Research 7 (2006) 1183-1204
Cited by 15 (5 self)
Abstract:
Dimensionality reduction is an important preprocessing step in many applications. Linear discriminant analysis (LDA) is a classical statistical approach for supervised dimensionality reduction. It aims to maximize the ratio of the between-class distance to the within-class distance, thus maximizing the class discrimination. It has been used widely in many applications. However, the classical LDA formulation requires the nonsingularity of the scatter matrices involved. For undersampled problems, where the data dimensionality is much larger than the sample size, all scatter matrices are singular and classical LDA fails. Many extensions, including null space LDA (NLDA) and orthogonal LDA (OLDA), have been proposed in the past to overcome this problem. NLDA aims to maximize the between-class distance in the null space of the within-class scatter matrix, while OLDA computes a set of orthogonal discriminant vectors via the simultaneous diagonalization of the scatter matrices. They have been applied successfully in various applications. In this ...
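The NLDA construction described here, maximizing between-class scatter inside the null space of the within-class scatter, can be sketched in a few lines of numpy. The function name and the null-space tolerance are ours; this is an illustration of the idea, not the paper's implementation.

```python
import numpy as np

def nlda(X, y, k):
    # Null-space LDA: restrict the search to null(Sw), then maximize
    # between-class scatter Sb within that subspace.
    n, d = X.shape
    mean = X.mean(axis=0)
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
        Sw += (Xc - mc).T @ (Xc - mc)
    # Orthonormal basis of null(Sw) from its SVD.
    U, s, _ = np.linalg.svd(Sw)
    null = U[:, s <= 1e-10 * max(s.max(), 1.0)]
    # Eigenvectors of Sb restricted to the null space, largest first.
    vals, vecs = np.linalg.eigh(null.T @ Sb @ null)
    return null @ vecs[:, np.argsort(-vals)[:k]]   # (d, k)
```

For undersampled data the null space of Sw is large (dimension at least d - n + c for c classes), so the projection exists, and every class collapses to a single point along the returned directions.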
Face recognition using discriminatively trained orthogonal rank one tensor projections
In Proc. CVPR, 2007
Cited by 10 (1 self)
Abstract:
We propose a method for face recognition based on a discriminative linear projection. In this formulation images are treated as tensors, rather than the more conventional vector of pixels. Projections are pursued sequentially and take the form of a rank-one tensor, i.e., a tensor which is the outer product of a set of vectors. A novel and effective technique is proposed to ensure that the rank-one tensor projections are orthogonal to one another. These constraints on the tensor projections provide a strong inductive bias and result in better generalization on small training sets. Our work is related to spectrum methods, which achieve orthogonal rank-one projections by pursuing consecutive projections in the complement space of previous projections. Although this may be meaningful for applications such as reconstruction, it is less meaningful for pursuing discriminant projections. Our new scheme iteratively solves an eigenvalue problem with orthogonality constraints on one dimension, and solves unconstrained eigenvalue problems on the other dimensions. Experiments demonstrate that on small and medium-sized face recognition datasets, this approach outperforms previous embedding methods. On large face datasets this approach achieves results comparable with the best, often using fewer discriminant projections.
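The central object here, a rank-one tensor projection u vᵀ applied to an image matrix, is easy to state concretely. The snippet below is a toy illustration of the projection and of the orthogonality constraint between two such tensors (whose inner product factors as (u1·u2)(v1·v2)); it is not the paper's training procedure.

```python
import numpy as np

def rank_one_project(image, u, v):
    # A rank-one projection of the image matrix X is the scalar
    # u^T X v -- equivalent to correlating X with the filter u v^T.
    return u @ image @ v

def tensors_orthogonal(u1, v1, u2, v2, tol=1e-12):
    # Two rank-one tensors u1 v1^T and u2 v2^T are orthogonal when
    # their inner product (u1 . u2) * (v1 . v2) vanishes.
    return abs(np.dot(u1, u2) * np.dot(v1, v2)) < tol
```

Note that orthogonality of the full tensors only requires one of the two factor pairs to be orthogonal, which is what makes the sequential pursuit in the paper less restrictive than working in the complement space of all previous projections.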
Feature reduction via generalized uncorrelated linear discriminant analysis
IEEE Trans. Knowl. Data Eng., 2006
Cited by 6 (0 self)
Abstract:
High-dimensional data appear in many applications of data mining, machine learning, and bioinformatics. Feature reduction is commonly applied as a preprocessing step to overcome the curse of dimensionality. Uncorrelated Linear Discriminant Analysis (ULDA) was recently proposed for feature reduction. The extracted features via ULDA were shown to be statistically uncorrelated, which is desirable for many applications. In this paper, an algorithm called ULDA/QR is proposed to simplify the previous implementation of ULDA. Then the ULDA/GSVD algorithm is proposed based on a novel optimization criterion, to address the singularity problem which occurs in undersampled problems, where the data dimension is larger than the data size. The criterion used is the regularized version of the one in ULDA/QR. Surprisingly, our theoretical result shows that the solution to ULDA/GSVD is independent of the value of the regularization parameter. Experimental results on various types of datasets are ...
Null space versus orthogonal linear discriminant analysis
Proc. Int’l Conf. Machine Learning, 2006
Cited by 6 (0 self)
Abstract:
Dimensionality reduction is an important preprocessing step for many applications. Linear Discriminant Analysis (LDA) is one of the well known methods for supervised dimensionality reduction. However, the classical LDA formulation requires the nonsingularity of the scatter matrices involved. For undersampled problems, where the data dimension is much larger than the sample size, all scatter matrices are singular and classical LDA fails. Many extensions, including null space based LDA (NLDA), orthogonal LDA (OLDA), etc., have been proposed in the past to overcome this problem. In this paper, we present a computational and theoretical analysis of NLDA and OLDA. Our main result shows that under a mild condition which holds in many applications involving high-dimensional data, NLDA is equivalent to OLDA. We have performed extensive experiments on various types of data and results are consistent with our theoretical analysis. The presented analysis and experimental results provide further insight into several LDA based algorithms.
Measuring playlist diversity for recommendation systems
In Proceedings of the 1st ACM Workshop on Audio and Music Computing Multimedia, 2006
Cited by 4 (1 self)
Abstract:
We describe a way to measure the diversity of consumers’ musical interests and characterize this diversity using published musical playlists. For each song in the playlist we calculate a set of features, which were optimized for genre recognition, and represent the song as a single point in a multi-dimensional genre space. Given the points for a set of songs, we fit an ellipsoid to the data, and then describe the diversity of the playlist by calculating the volume of the enclosing ellipsoid. We compare 887 different playlists, representing nearly 29,000 distinct songs, to collections of different genres and to the size of our entire database. Playlists tend to be less diverse than a genre, and, by our measure, about 5 orders of magnitude smaller than the entire song set. These characteristics are important for recommendation systems, which aim to present users with a set of recommendations tuned to each user’s diversity.
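The volume computation this abstract describes can be sketched with a covariance ellipsoid, a simpler stand-in for the enclosing ellipsoid the authors fit. Returning a log-volume avoids the underflow that the several-orders-of-magnitude comparisons they report would otherwise cause; the function name and the n_std scaling are ours.

```python
import math
import numpy as np

def playlist_log_volume(points, n_std=2.0):
    # points: (n_songs, d) coordinates in genre space.
    # Log-volume of the n_std-sigma covariance ellipsoid:
    # V = V_d(1) * n_std^d * sqrt(det(cov)), V_d(1) = unit d-ball volume.
    d = points.shape[1]
    cov = np.cov(points, rowvar=False)
    _, logdet = np.linalg.slogdet(cov)
    log_unit_ball = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    return log_unit_ball + d * math.log(n_std) + 0.5 * logdet
```

A playlist whose songs scatter widely in genre space then gets a larger log-volume than a tightly clustered one, matching the diversity ordering the paper relies on.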
The Performance Of Statistical Pattern Recognition Methods In High Dimensional Settings
IEEE Signal Processing Workshop on Higher Order Statistics, Caesarea, 1994
Cited by 2 (0 self)
Abstract:
We report on an extensive simulation study comparing eight statistical classification methods, focusing on problems where the number of observations is less than the number of variables. Using a wide range of artificial and real data, two types of classifiers were contrasted: methods that classify using all variables, and methods that first reduce the number of dimensions to two or three. The full feature space methods include linear, quadratic and regularized discriminant analysis, and the nearest neighbour method. The four dimensionality reducing classifiers are characterized by the transform they implement. The four transforms compared are the Fisher discriminant plane, the Fisher-Fukunaga-Koontz, the Fisher-radius, and the Fisher-variance transforms. The Fisher-Fukunaga-Koontz and the Fisher-radius transform based classifiers have recently been proposed for two class classification problems. We also present an extension to these transforms such that they can be applied to classification pro...