Results 1 - 10 of 10,210

Unsupervised Speaker Clustering in a Linear Discriminant Subspace

by Theodoris Giannakopoulos, Sergios Petridis
"... Abstract—We present an approach for grouping single-speaker speech segments into speaker-specific clusters. Our approach is based on applying the K-means clustering algorithm to a suitable discriminant subspace, where the Euclidean distance reflects speaker differences. A core feature of our approach ..."
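
The clustering step this abstract describes — K-means under the Euclidean distance — can be sketched as below. This is an illustrative toy, not the authors' code: it assumes the speech segments have already been projected into the discriminant subspace, and fakes that input with two well-separated Gaussian blobs standing in for two speakers.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain K-means with the Euclidean distance; X is (n_samples, n_dims)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign every segment to its nearest cluster center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centers; keep the old one if a cluster went empty.
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Toy stand-in for projected speech segments: two well-separated "speakers".
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(20, 2)),
               rng.normal(5.0, 0.3, size=(20, 2))])
labels, _ = kmeans(X, k=2)
```

In the paper's setting the input dimensions would be learned discriminant directions rather than raw acoustic features; the clustering logic itself is unchanged.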

A Nonparametric Statistical Comparison of Principal Component and Linear Discriminant Subspaces for Face Recognition

by J. Ross Beveridge, Kai She, Bruce A. Draper, Geof H. Givens - In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition , 2001
"... The FERET evaluation compared recognition rates for different semi-automated and automated face recognition algorithms. We extend FERET by considering when differences in recognition rates are statistically distinguishable subject to changes in test imagery. Nearest Neighbor classifiers using princi ..."
Abstract - Cited by 72 (11 self)
principal component and linear discriminant subspaces are compared using different choices of distance metric. Probability distributions for algorithm recognition rates and pairwise differences in recognition rates are determined using a permutation methodology. The principal component subspace
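
A paired permutation test of the kind this abstract alludes to can be sketched as follows. This is a generic illustration, not the paper's exact protocol; `a` and `b` are assumed to be per-probe 0/1 correctness indicators for two recognition algorithms evaluated on the same probe images.

```python
import numpy as np

def paired_permutation_pvalue(a, b, n_perm=10000, seed=0):
    """Two-sided permutation test on the mean difference of paired 0/1
    outcomes a, b (1 = correct). Under the null hypothesis the two
    algorithms are exchangeable on each probe, so we randomly flip the
    sign of each paired difference and see how often the permuted mean
    difference is at least as extreme as the observed one."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = abs(a.mean() - b.mean())
    diffs = a - b
    count = 0
    for _ in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=len(diffs))
        if abs((signs * diffs).mean()) >= observed:
            count += 1
    return count / n_perm

# Algorithm A correct on 90/100 probes, B on 70/100: clearly distinguishable.
a = np.array([1] * 90 + [0] * 10)
b = np.array([1] * 70 + [0] * 30)
p = paired_permutation_pvalue(a, b)
```

Because the test resamples the observed outcomes rather than assuming a distribution for recognition rates, it is nonparametric in the same spirit as the paper's methodology.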

publications, "A Nonparametric Statistical Comparison of Principal Component and Linear Discriminant Subspaces for Face Recognition" presented at CVPR 2001 and

by J. Ross Beveridge, Kai She
"... This short paper updates results presented in two previous ..."

Lambertian Reflectance and Linear Subspaces

by Ronen Basri, David Jacobs , 2000
"... We prove that the set of all reflectance functions (the mapping from surface normals to intensities) produced by Lambertian objects under distant, isotropic lighting lies close to a 9D linear subspace. This implies that, in general, the set of images of a convex Lambertian object obtained under a wi ..."
Abstract - Cited by 526 (20 self)
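
The 9D claim can be checked numerically. The sketch below is not the paper's proof: it renders a toy convex Lambertian surface (unit albedo, random unit normals) under many distant, isotropically distributed light directions and measures, via the SVD, how much of the image-set energy the best nine-dimensional subspace captures.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit(n):
    """n uniformly distributed unit vectors in R^3."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

normals = random_unit(500)    # surface normals, one per "pixel"
lights = random_unit(2000)    # distant, isotropically distributed lights

# Lambertian shading with attached shadows: intensity = max(0, n . l).
# Each column is one image of the object under one light direction.
images = np.maximum(normals @ lights.T, 0.0)

s = np.linalg.svd(images, compute_uv=False)
energy9 = (s[:9] ** 2).sum() / (s ** 2).sum()
```

On runs like this the top nine singular directions account for well over 95% of the total energy, consistent with the result that the reflectance functions lie close to a 9D linear subspace.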

Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection

by Peter N. Belhumeur, João P. Hespanha, David J. Kriegman , 1997
"... We develop a face recognition algorithm which is insensitive to gross variation in lighting direction and facial expression. Taking a pattern classification approach, we consider each pixel in an image as a coordinate in a high-dimensional space. We take advantage of the observation that the images ..."
Abstract - Cited by 2310 (17 self)
from this linear subspace. Rather than explicitly modeling this deviation, we linearly project the image into a subspace in a manner which discounts those regions of the face with large deviation. Our projection method is based on Fisher's Linear Discriminant and produces well separated classes
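
Fisher's Linear Discriminant, the projection underlying the Fisherfaces method, can be sketched as below. This is a generic LDA implementation, not the authors' pipeline — the paper first reduces dimension with PCA so the within-class scatter matrix is nonsingular, whereas here a small ridge term plays that role.

```python
import numpy as np

def fisher_lda(X, y, n_components):
    """Project X onto the leading eigenvectors of Sw^{-1} Sb, where Sw is
    the within-class scatter and Sb the between-class scatter."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean)[:, None]
        Sb += len(Xc) * diff @ diff.T
    # Small ridge keeps the solve well posed when Sw is near-singular.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    W = evecs[:, order[:n_components]].real
    return X @ W

# Two toy "classes" in 4 dimensions, separated in every coordinate.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(3, 1, (30, 4))])
y = np.array([0] * 30 + [1] * 30)
Z = fisher_lda(X, y, 1)
```

The Fisher direction concentrates between-class separation while suppressing within-class spread, which is exactly the "well separated classes" property the excerpt describes.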

Fisher Discriminant Analysis With Kernels

by Sebastian Mika, Gunnar Rätsch, Jason Weston, Bernhard Schölkopf, Klaus-Robert Müller , 1999
"... A non-linear classification technique based on Fisher's discriminant is proposed. The main ingredient is the kernel trick which allows the efficient computation of Fisher discriminant in feature space. The linear classification in feature space corresponds to a (powerful) non-linear decision f ..."
Abstract - Cited by 503 (18 self)
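
The kernel trick the abstract refers to can be sketched for the two-class case as follows. This is a schematic reading of kernel Fisher discriminant analysis with an RBF kernel and a ridge-regularized within-class matrix `N`; it is an illustration, not the authors' code, and the dataset (a cluster inside a ring, not linearly separable) is invented for the demo.

```python
import numpy as np

def rbf(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_fda(X, y, gamma=0.5, reg=1e-3):
    """Two-class kernel Fisher discriminant: find expansion coefficients
    alpha maximizing (alpha^T M alpha) / (alpha^T N alpha) in feature
    space, which reduces to alpha ~ N^{-1} (m1 - m0)."""
    K = rbf(X, X, gamma)
    n = len(X)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)
    m1 = K[:, idx1].mean(axis=1)
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        l = len(idx)
        N += Kc @ (np.eye(l) - np.full((l, l), 1.0 / l)) @ Kc.T
    alpha = np.linalg.solve(N + reg * np.eye(n), m1 - m0)  # ridge for stability
    return lambda Xnew: rbf(Xnew, X, gamma) @ alpha

# Class 0: cluster at the origin; class 1: ring of radius ~3.
rng = np.random.default_rng(0)
r = np.r_[rng.normal(0.0, 0.2, 40), rng.normal(3.0, 0.2, 40)]
t = rng.uniform(0, 2 * np.pi, 80)
X = np.column_stack([r * np.cos(t), r * np.sin(t)])
y = np.array([0] * 40 + [1] * 40)
project = kernel_fda(X, y)
z = project(X)
```

Thresholding the one-dimensional projection `z` separates the two radial classes, which no linear discriminant in the original input space could do — the "(powerful) non-linear decision" of the abstract.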

Using Discriminant Eigenfeatures for Image Retrieval

by Daniel L. Swets, John Weng , 1996
"... This paper describes the automatic selection of features from an image training set using the theories of multi-dimensional linear discriminant analysis and the associated optimal linear projection. We demonstrate the effectiveness of these Most Discriminating Features for view-based class retrieval ..."
Abstract - Cited by 508 (15 self)

GMRES: A generalized minimal residual algorithm for solving nonsymmetric linear systems

by Youcef Saad, Martin H. Schultz - SIAM J. SCI. STAT. COMPUT , 1986
"... We present an iterative method for solving linear systems, which has the property of minimizing at every step the norm of the residual vector over a Krylov subspace. The algorithm is derived from the Arnoldi process for constructing an l2-orthogonal basis of Krylov subspaces. It can be considered a ..."
Abstract - Cited by 2076 (41 self)
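
A minimal unrestarted GMRES following the abstract's description — the Arnoldi process builds an l2-orthonormal Krylov basis, then a small least-squares problem minimizes the residual norm over that subspace — might look like the sketch below. It omits the restarting and Givens-rotation updates a production GMRES(m) would use.

```python
import numpy as np

def gmres(A, b, m=None, tol=1e-10):
    """Basic (full) GMRES with starting guess x0 = 0, so r0 = b."""
    n = len(b)
    m = m or n
    Q = np.zeros((n, m + 1))   # orthonormal Krylov basis vectors
    H = np.zeros((m + 1, m))   # upper Hessenberg matrix from Arnoldi
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    x = np.zeros(n)
    for j in range(m):
        w = A @ Q[:, j]
        for i in range(j + 1):           # modified Gram-Schmidt step
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        # Minimize ||beta e1 - H y|| over the current Krylov subspace.
        e1 = np.zeros(j + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)
        x = Q[:, :j + 1] @ y
        if np.linalg.norm(b - A @ x) < tol or H[j + 1, j] < tol:
            return x                      # converged or lucky breakdown
        Q[:, j + 1] = w / H[j + 1, j]
    return x

# Nonsymmetric but well-conditioned test system (diagonally dominated).
rng = np.random.default_rng(0)
A = rng.normal(size=(30, 30)) + 30 * np.eye(30)
b = rng.normal(size=30)
x = gmres(A, b)
```

Re-solving the small least-squares problem from scratch at every step is wasteful; the practical algorithm updates a QR factorization of H incrementally, but the subspace being minimized over is the same.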

Comparison of discrimination methods for the classification of tumors using gene expression data

by Sandrine Dudoit, Jane Fridlyand, Terence P. Speed - JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION , 2002
"... A reliable and precise classification of tumors is essential for successful diagnosis and treatment of cancer. cDNA microarrays and high-density oligonucleotide chips are novel biotechnologies increasingly used in cancer research. By allowing the monitoring of expression levels in cells for thousand ..."
Abstract - Cited by 770 (6 self)
gene expression data is an important aspect of this novel approach to cancer classification. This article compares the performance of different discrimination methods for the classification of tumors based on gene expression data. The methods include nearest-neighbor classifiers, linear discriminant

Using Linear Algebra for Intelligent Information Retrieval

by Michael W. Berry, Susan T. Dumais - SIAM REVIEW , 1995
"... Currently, most approaches to retrieving textual materials from scientific databases depend on a lexical match between words in users' requests and those in or assigned to documents in a database. Because of the tremendous diversity in the words people use to describe the same document, lexical ..."
Abstract - Cited by 676 (18 self)
by 200-300 of the largest singular vectors are then matched against user queries. We call this retrieval method Latent Semantic Indexing (LSI) because the subspace represents important associative relationships between terms and documents that are not evident in individual documents. LSI is a completely
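
The LSI construction the excerpt describes — a truncated SVD of the term-document matrix, with queries folded into the same subspace — can be sketched as follows. The four toy documents, the tiny value of k, and the raw-count weighting are illustrative stand-ins: real LSI keeps on the order of 200-300 singular vectors and applies term weighting before the decomposition.

```python
import numpy as np

docs = ["linear algebra matrix decomposition",
        "singular value decomposition of a matrix",
        "cooking pasta with tomato sauce",
        "tomato sauce recipe for pasta"]

# Term-document matrix of raw counts (rows = terms, columns = documents).
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(w) for d in docs] for w in vocab], float)

# Keep only the k largest singular triplets: the LSI subspace.
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T      # documents in LSI coordinates

def query_scores(q):
    """Fold a query into the LSI subspace and return cosine similarities."""
    qv = np.array([q.split().count(w) for w in vocab], float)
    qk = qv @ U[:, :k]
    sims = doc_vecs @ qk
    return sims / (np.linalg.norm(doc_vecs, axis=1)
                   * np.linalg.norm(qk) + 1e-12)

scores = query_scores("matrix decomposition")
```

Both linear-algebra documents score highly even though only one of them contains the exact query terms, while the cooking documents score near zero — the associative matching behind the name Latent Semantic Indexing.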

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University