## Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria (2001)


### Download Links

- www.ph.tn.tudelft.nl
- www.isi.uu.nl
- www.itu.dk
- DBLP

### Other Repositories/Bibliography

Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence

Citations: 65 (4 self)

### BibTeX

```bibtex
@ARTICLE{Loog01multiclasslinear,
  author  = {Marco Loog and R.P.W. Duin and R. Haeb-Umbach},
  title   = {Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year    = {2001},
  volume  = {23},
  pages   = {762--766}
}
```



### Abstract

We derive a class of computationally inexpensive linear dimension reduction criteria by introducing a weighted variant of the well-known K-class Fisher criterion associated with linear discriminant analysis (LDA). LDA can be seen to weight the contribution of each class pair according to the Euclidean distance between the respective class means. We generalize LDA by introducing a different weighting function.
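As a minimal sketch of the idea described in the abstract (not the authors' code): build a pairwise-weighted between-class scatter matrix and solve the usual generalized eigenproblem. The erf-based weight below is the aPAC form attributed to [10]; treat its exact expression, and all names here, as assumptions.

```python
import numpy as np
from math import erf

def weighted_pairwise_lda(X, y, d, weight=None):
    """Linear dimension reduction by a weighted pairwise Fisher criterion.

    weight maps the Mahalanobis distance between two class means to a
    scalar; weight=None gives uniform pair weights, which recovers the
    classical multiclass LDA between-class scatter.
    """
    classes, counts = np.unique(y, return_counts=True)
    priors = counts / len(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])

    # Average within-class scatter S_w (prior-weighted ML covariances).
    Sw = sum(p * np.cov(X[y == c], rowvar=False, bias=True)
             for c, p in zip(classes, priors))
    Sw_inv = np.linalg.inv(Sw)

    # Weighted between-class scatter: sum over all class pairs.
    n_feat = X.shape[1]
    Sb = np.zeros((n_feat, n_feat))
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            diff = means[i] - means[j]
            delta = np.sqrt(diff @ Sw_inv @ diff)  # Mahalanobis distance
            w = 1.0 if weight is None else weight(delta)
            Sb += priors[i] * priors[j] * w * np.outer(diff, diff)

    # Top-d eigenvectors of Sw^{-1} Sb form the transformation matrix.
    eigvals, eigvecs = np.linalg.eig(Sw_inv @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:d]].real

# aPAC-style weight (assumed form): de-emphasizes distant, already
# well-separated class pairs relative to plain LDA.
apac = lambda delta: erf(delta / (2 * np.sqrt(2))) / (2 * delta ** 2)
```

With `weight=None` the pairwise sum equals the standard between-class scatter, so the function reduces to ordinary multiclass LDA; plugging in `apac` changes only the pair weights, which is the point of the paper.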

### Citations

2644 | Introduction to Statistical Pattern Recognition - Fukunaga - 1972
Citation context: "...s): A transformation matrix from an n-dimensional feature space to a d-dimensional space is determined such that the Fisher criterion of total scatter versus average within-class scatter is maximized [6]. Campbell has shown that the determination of the LDA transform is equivalent to finding the maximum-likelihood (ML) parameter estimates of a Gaussian model, assuming that all class discrimination in..."

149 | Discriminant analysis by Gaussian mixtures - Hastie, Tibshirani - 1996 |

91 | The statistical utilization of multiple measurements - Fisher - 1938 |

86 | Heteroscedastic Discriminant Analysis and Reduced Rank HMMs for Improved Speech Recognition - Kumar, Andreou - 1998
Citation context: "...e already been published in [4]. Conclusions are drawn in Section 5. Several alternative approaches to multiclass LDR are known. In some of them, the problem is stated as an ML estimation task, e.g., [9], [7], in others the divergence is used as a measure for class separation [3]. These criteria, however, also are not directly related to the classification rate. This also holds for the eigenvalue dec..."

55 | The Utilization of Multiple Measurements in Problems of Biological Classification - Rao - 1948
Citation context: "...oblems related to this. The most well-known technique for linear dimension reduction (LDR) in the K-class problem is linear discriminant analysis (LDA) (Fisher [5] introduced two-class LDA, while Rao [13] generalized LDA to multiple classes): A transformation matrix from an n-dimensional feature space to a d-dimensional space is determined such that the Fisher criterion of total scatter versus average..."

50 | The Nonlinear PCA Learning Rule in Independent Component Analysis, Neurocomput. - Oja - 1997

14 | Approximate Pairwise Accuracy Criteria for Multiclass Linear Dimension Reduction: Generalisations of the Fisher Criterion - Loog - 1999
Citation context: "...of individual class pairs to the overall criterion in order to improve upon LDA. The weighting scheme discussed in this paper (Section 3) is called the approximate pairwise accuracy criterion (aPAC) [10]: Here, the weighting is derived from an attempt to ... M. Loog is with the Image Sciences Institute, University Medical Center Utrecht, PO Box 85500, 3508 GA Utrecht, The Netherlands. E-mail: marco@isi..."

9 | Toward Bayes-optimal linear dimension reduction - Buturović - 1994 |

8 | Feature combinations and the divergence criterion - Decell, Mayekar - 1977 |

5 | Canonical variate analysis—a general model formulation. Australian journal of statistics - Campbell - 1984 |

3 | Feature selection and extraction. In Handbook of pattern recognition and image processing - Kittler - 1986 |

2 | The Statistical Utilization of Multiple Measurements, Ann. - Fisher - 1938
Citation context: "...to overcome estimation problems, and problems related to this. The most well-known technique for linear dimension reduction (LDR) in the K-class problem is linear discriminant analysis (LDA) (Fisher [5] introduced two-class LDA, while Rao [13] generalized LDA to multiple classes): A transformation matrix from an n-dimensional feature space to a d-dimensional space is determined such that the Fisher..."

2 | Discriminant Analysis by Gaussian Mixtures - Hastie, Tibshirani - 1996
Citation context: "...eady been published in [4]. Conclusions are drawn in Section 5. Several alternative approaches to multiclass LDR are known. In some of them, the problem is stated as an ML estimation task, e.g., [9], [7], in others the divergence is used as a measure for class separation [3]. These criteria, however, also are not directly related to the classification rate. This also holds for the eigenvalue decompos..."

2 | The Nonlinear PCA Learning Rule - Oja - 1997
Citation context: "...rlap problem are usually iterative and, thereby, much more computationally demanding, e.g., the Patrick-Fisher approach described in [8], the nonlinear principal component analysis by neural networks [12], and the general, nonparametric approach suggested by Buturovic [1]. 2 THE FISHER CRITERION AND ITS NONOPTIMALITY Multiclass LDR is concerned with the search for a linear transformation that reduces..."

1 | Toward Bayes-Optimal Linear Dimension Reduction - Buturović - 1994
Citation context: "...nally demanding, e.g., the Patrick-Fisher approach described in [8], the nonlinear principal component analysis by neural networks [12], and the general, nonparametric approach suggested by Buturovic [1]. 2 THE FISHER CRITERION AND ITS NONOPTIMALITY Multiclass LDR is concerned with the search for a linear transformation that reduces the dimension of a given n-dimensional statistical model, consisting..."

1 | Canonical Variate Analysis—A General Model Formulation, Australian - Campbell - 1984
Citation context: "...del, assuming that all class discrimination information resides in a d-dimensional subspace of the original n-dimensional feature space and that the within-class covariances are equal for all classes [2]. However, for a K-class problem with K > 2, the Fisher criterion is clearly suboptimal. This is seen by a decomposition (Section 2) of the K-class Fisher criterion into a sum of K(K-1)/2 two-class..."
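The decomposition this context refers to can be written out explicitly. Notation is assumed, not taken from the paper: class priors $p_i$, class means $m_i$, and overall mean $\bar m = \sum_i p_i m_i$.

```latex
% Pairwise decomposition of the between-class scatter matrix S_B:
\[
S_B \;=\; \sum_{i=1}^{K} p_i\,(m_i - \bar m)(m_i - \bar m)^{\top}
    \;=\; \sum_{i=1}^{K-1}\sum_{j=i+1}^{K} p_i\,p_j\,(m_i - m_j)(m_i - m_j)^{\top}.
\]
% The K-class Fisher criterion therefore splits into K(K-1)/2 two-class
% terms, which is what makes per-pair reweighting possible.
```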

1 | Feature Combinations and the Divergence Criterion - Decell, Mayekar - 1977
Citation context: "...alternative approaches to multiclass LDR are known. In some of them, the problem is stated as an ML estimation task, e.g., [9], [7], in others the divergence is used as a measure for class separation [3]. These criteria, however, also are not directly related to the classification rate. This also holds for the eigenvalue decomposition based approach by Young and Odell [14]. Procedures that deal with..."

1 | Multi-Class Linear Feature Extraction by Nonlinear PCA - Duin, Loog, et al. - 2000
Citation context: "...es the LDR approach based on our aPAC with LDA and with a neural network-based approach. Part of the theory was previously reported in [10] and the experimental results have already been published in [4]. Conclusions are drawn in Section 5. Several alternative approaches to multiclass LDR are known. In some of them, the problem is stated as an ML estimation task, e.g., [9], [7], in others the diverge..."

1 | Feature Selection and Extraction - Kittler - 1986
Citation context: "...ch by Young and Odell [14]. Procedures that deal with the class overlap problem are usually iterative and, thereby, much more computationally demanding, e.g., the Patrick-Fisher approach described in [8], the nonlinear principal component analysis by neural networks [12], and the general, nonparametric approach suggested by Buturovic [1]. 2 THE FISHER CRITERION AND ITS NONOPTIMALITY Multiclass LDR is..."

1 | A Formulation and Comparison of Two Linear Feature Selection Techniques Applicable to Statistical Classification - Young, Odell - 1984
Citation context: "...measure for class separation [3]. These criteria, however, also are not directly related to the classification rate. This also holds for the eigenvalue decomposition based approach by Young and Odell [14]. Procedures that deal with the class overlap problem are usually iterative and, thereby, much more computationally demanding, e.g., the Patrick-Fisher approach described in [8], the nonlinear princip..."
