Results 1 - 7 of 7
Linear Dimensionality Reduction via a Heteroscedastic Extension of LDA: The Chernoff Criterion
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004

Cited by 90 (0 self)
Abstract—We propose an eigenvector-based heteroscedastic linear dimension reduction (LDR) technique for multi-class data. The technique is based on a heteroscedastic two-class technique which utilizes the so-called Chernoff criterion, and successfully extends the well-known linear discriminant analysis (LDA). The latter, which is based on the Fisher criterion, is incapable of dealing with heteroscedastic data in a proper way. For the two-class case, the between-class scatter is generalized so as to capture differences in (co)variances. It is shown that the classical notion of between-class scatter can be associated with Euclidean distances between class means. From this viewpoint, the between-class scatter is generalized by employing the Chernoff distance measure, leading to our proposed heteroscedastic measure. Finally, using the results from the two-class case, a multi-class extension of the Chernoff criterion is proposed. This criterion combines separation information present in the class means as well as the class covariance matrices. Extensive experiments and a comparison with similar dimension reduction techniques are presented. Index Terms—Linear dimension reduction, linear discriminant analysis, Fisher criterion, Chernoff distance, Chernoff criterion.
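The key quantity behind this line of work is the Chernoff distance between two Gaussian class models, which, unlike the Euclidean distance between means underlying the classical between-class scatter, also reacts to covariance differences. A minimal numeric sketch of that distance, assuming Gaussian classes (the helpers below are illustrative, not the paper's exact scatter construction):

```python
import numpy as np

def chernoff_distance(mu1, cov1, mu2, cov2, alpha=0.5):
    """Chernoff distance between N(mu1, cov1) and N(mu2, cov2).

    The first term is a scaled Mahalanobis distance between the means;
    the second (log-determinant) term is nonzero only when the
    covariances differ -- the heteroscedastic part."""
    d = mu1 - mu2
    mix = alpha * cov1 + (1 - alpha) * cov2
    quad = 0.5 * alpha * (1 - alpha) * d @ np.linalg.solve(mix, d)
    logdet = 0.5 * (np.linalg.slogdet(mix)[1]
                    - alpha * np.linalg.slogdet(cov1)[1]
                    - (1 - alpha) * np.linalg.slogdet(cov2)[1])
    return quad + logdet

# Two classes with identical means but different covariances:
# Fisher's criterion sees no separation, the Chernoff distance does.
mu = np.zeros(2)
cov1, cov2 = np.eye(2), np.diag([4.0, 1.0])
print(chernoff_distance(mu, cov1, mu, cov2))  # positive
print(chernoff_distance(mu, cov1, mu, cov1))  # zero: identical classes
```

This is why a between-class scatter generalized via the Chernoff distance can still separate classes whose means coincide, where plain LDA fails.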
On the Performance of Chernoff-distance-based Linear Dimensionality Reduction Techniques

Cited by 7 (2 self)
Abstract. We present a performance analysis of three linear dimensionality reduction techniques: Fisher’s discriminant analysis (FDA), and two methods introduced recently based on the Chernoff distance between two distributions: the Loog and Duin (LD) method, which aims to maximize a criterion derived from the Chernoff distance in the original space, and the one introduced by Rueda and Herrera (RH), which aims to maximize the Chernoff distance in the transformed space. A comprehensive performance analysis of these methods, combined with two well-known classifiers, linear and quadratic, on synthetic and real-life data shows that LD and RH outperform FDA, especially with the quadratic classifier, which is strongly related to the Chernoff distance in the transformed space. In the case of the linear classifier, the superiority of RH over the other two methods is also demonstrated.
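The distinction between the two Chernoff-based methods is where the distance is evaluated: LD builds its criterion in the original space, while RH maximizes the Chernoff distance of the projected (transformed) class distributions. A sketch of the RH-style objective for a one-dimensional projection, with a crude random search standing in for their actual optimizer (assumed Gaussian classes; not the authors' algorithm):

```python
import numpy as np

def projected_chernoff(w, mu1, cov1, mu2, cov2, alpha=0.5):
    """Chernoff distance between the two class Gaussians after projecting onto w."""
    w = w / np.linalg.norm(w)
    dm = w @ (mu1 - mu2)                      # projected mean difference
    v1, v2 = w @ cov1 @ w, w @ cov2 @ w       # projected class variances
    vm = alpha * v1 + (1 - alpha) * v2
    return (0.5 * alpha * (1 - alpha) * dm**2 / vm
            + 0.5 * (np.log(vm) - alpha * np.log(v1) - (1 - alpha) * np.log(v2)))

# Equal means, unequal variances along x: FDA has nothing to maximize,
# but the projected Chernoff distance prefers the x-axis, where the
# class variances differ most.
mu1 = mu2 = np.zeros(2)
cov1, cov2 = np.eye(2), np.diag([9.0, 1.0])
rng = np.random.default_rng(0)
candidates = rng.normal(size=(500, 2))        # random search, not RH's solver
best = max(candidates, key=lambda w: projected_chernoff(w, mu1, cov1, mu2, cov2))
print(best / np.linalg.norm(best))            # close to +/-(1, 0)
```

The quadratic classifier's advantage reported above is consistent with this picture: its decision rule is driven by exactly the projected means and variances that the transformed-space Chernoff distance measures.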
A Theoretical Comparison of Two-class Fisher’s and Heteroscedastic Linear Dimensionality Reduction Schemes
We present a theoretical analysis for comparing two linear dimensionality reduction (LDR) techniques for two classes: a homoscedastic LDR scheme, Fisher’s discriminant (FD), and a heteroscedastic LDR scheme, Loog-Duin (LD). We formalize the necessary and sufficient conditions under which the FD and LD criteria are maximized by the same linear transformation. To derive these conditions, we first show that the two criteria preserve the same maximum values after a diagonalization process is applied. We derive the necessary and sufficient conditions for various cases, including coincident covariance matrices, coincident prior probabilities, and the case in which one of the covariances is the identity matrix. We empirically show that the conditions are statistically related to the classification error for a post-processing one-dimensional quadratic classifier and the Chernoff distance in the transformed space.
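The coincident-covariance case can be checked numerically: when the two class covariances are equal, the log-determinant part of the Chernoff distance vanishes identically, leaving only a scaled Mahalanobis distance between the means, which is the quantity Fisher's criterion maximizes. A small sketch under standard Gaussian assumptions (the helper is illustrative, not the FD/LD machinery of the paper):

```python
import numpy as np

def chernoff_terms(mu1, cov1, mu2, cov2, alpha=0.5):
    """Return the (quadratic, log-determinant) parts of the Chernoff distance."""
    d = mu1 - mu2
    mix = alpha * cov1 + (1 - alpha) * cov2
    quad = 0.5 * alpha * (1 - alpha) * d @ np.linalg.solve(mix, d)
    logdet = 0.5 * (np.linalg.slogdet(mix)[1]
                    - alpha * np.linalg.slogdet(cov1)[1]
                    - (1 - alpha) * np.linalg.slogdet(cov2)[1])
    return quad, logdet

mu1, mu2 = np.array([1.0, 0.0]), np.array([-1.0, 0.5])
cov = np.array([[2.0, 0.3], [0.3, 1.0]])
quad, logdet = chernoff_terms(mu1, cov, mu2, cov)
print(quad, logdet)  # log-determinant term is zero for coincident covariances
```

With the heteroscedastic term gone, both criteria rank linear transformations by the same Mahalanobis separation of the means, which is the intuition behind the equivalence conditions above.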
A Theoretical Comparison of Two Linear Dimensionality Reduction Techniques
Abstract. A theoretical analysis for comparing two linear dimensionality reduction (LDR) techniques, namely Fisher’s discriminant (FD) and Loog-Duin (LD) dimensionality reduction, is presented. The necessary and sufficient conditions under which FD and LD provide the same linear transformation are discussed and proved. To derive these conditions, it is first shown that the two criteria preserve the same maximum value after a diagonalization process is applied, and the necessary and sufficient conditions are then derived for various cases, including coincident covariance matrices, coincident prior probabilities, and the case in which one of the covariances is the identity matrix. A measure for comparing the two criteria is derived from the necessary and sufficient conditions, and used to empirically show that the conditions are statistically related to the classification error for a post-processing quadratic classifier and the Chernoff distance in the transformed space.
Improved LDA by using Distributing Distances and Boundary Patterns
Abstract. One of the statistical methods for class discrimination is linear discriminant analysis. Using statistical parameters, this method obtains a space in which classification is performed with the discriminating information available among the class means. Using distribution distances, linear discriminant analysis is extended to its heteroscedastic form; in this form, to separate the classes further, the separating information available in the class covariance matrices, in addition to the class means, is exploited. In this article, new scatter matrices defined on boundary and non-boundary patterns are used, which reduces the class overlap in the resulting spaces. Moreover, the new scatter matrices increase the classification rate; the experiments performed confirm this improvement.
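The abstract does not spell out how boundary patterns are detected. One common reading, sketched below purely as an assumption, flags a pattern as "boundary" when its nearest neighbor belongs to a different class, and then builds the between-class scatter from those patterns only (both helpers are hypothetical, not this paper's definitions):

```python
import numpy as np

def boundary_mask(X, y):
    """Flag samples whose nearest neighbor belongs to a different class
    (an assumed, common definition of 'boundary pattern')."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)          # ignore self-distances
    return y[D.argmin(axis=1)] != y

def boundary_between_scatter(X, y):
    """Between-class scatter computed from boundary patterns only."""
    mask = boundary_mask(X, y)
    Xb, yb = X[mask], y[mask]
    mu = Xb.mean(axis=0)
    S = np.zeros((X.shape[1], X.shape[1]))
    for c in np.unique(yb):
        mc = Xb[yb == c].mean(axis=0)
        S += (yb == c).sum() * np.outer(mc - mu, mc - mu)
    return S

X = np.array([[0.0, 0.0], [0.9, 0.0], [1.1, 0.0], [2.0, 0.0]])
y = np.array([0, 0, 1, 1])
print(boundary_mask(X, y))               # only the two middle points are boundary
print(boundary_between_scatter(X, y))
```

Restricting the scatter to patterns near the class boundary concentrates the criterion on the region where overlap actually occurs, which is the plausible mechanism behind the reported reduction in class overlap.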