Results 1–10 of 31
Learning over Sets using Kernel Principal Angles
 Journal of Machine Learning Research
, 2003
Abstract

Cited by 79 (2 self)
We consider the problem of learning with instances defined over a space of sets of vectors. We derive a new positive definite kernel f(A, B) defined over pairs of matrices A, B based on the concept of principal angles between two linear subspaces. We show that the principal angles can be recovered using only inner products between pairs of column vectors of the input matrices, thereby allowing the original column vectors of A, B to be mapped onto arbitrarily high-dimensional feature spaces.
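The quantity this kernel is built from has a compact illustration. Below is a minimal sketch of the standard linear computation of principal angles (QR-orthonormalize each basis, then take singular values of the product of the orthonormal bases); the paper's contribution is recovering the same angles from inner products alone, which is not reproduced here:

```python
import numpy as np

def principal_angles(A, B):
    """Cosines of the principal angles between the column spans of A and B.

    Standard linear computation: orthonormalize each basis with QR,
    then the singular values of Qa^T Qb are the cosines of the
    principal angles, in descending order.
    """
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    cosines = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.clip(cosines, 0.0, 1.0)  # guard against rounding past [0, 1]

# Two 2-D subspaces of R^3 sharing one direction: the largest cosine
# is 1.0 (zero angle along the shared direction), the other is 0.0.
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # span{e1, e2}
B = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])  # span{e1, e3}
print(principal_angles(A, B))
```

Because only inner products of columns enter through Qa^T Qb, the same cosines can be computed after mapping the columns into a feature space, which is what makes the kernelization possible.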
Discriminative Learning and Recognition of Image Set Classes Using Canonical Correlations
 IEEE Trans. Pattern Analysis and Machine Intelligence
, 2007
Abstract

Cited by 49 (10 self)
We address the problem of comparing sets of images for object recognition, where the sets may represent variations in an object's appearance due to changing camera pose and lighting conditions. Canonical Correlations (also known as principal or canonical angles), which can be thought of as the angles between two d-dimensional subspaces, have recently attracted attention for image set matching. Canonical correlations offer many benefits in accuracy, efficiency, and robustness compared to the two main classical methods: parametric distribution-based and non-parametric sample-based matching of sets. Here, this is first demonstrated experimentally for reasonably sized data sets using existing methods exploiting canonical correlations. Motivated by their proven effectiveness, a novel discriminative learning method over sets is proposed for set classification. Specifically, inspired by classical Linear Discriminant Analysis (LDA), we develop a linear discriminant function that maximizes the canonical correlations of within-class sets and minimizes the canonical correlations of between-class sets. Image sets transformed by the discriminant function are then compared by the canonical correlations. The classical orthogonal subspace method (OSM) is also investigated for a similar purpose and compared with the proposed method. The proposed method is evaluated on various object recognition problems using face image sets with arbitrary motion captured under different illuminations and image sets of 500 general objects taken at different views. The method is also applied to object category recognition using the ETH-80 database. The proposed method is shown to outperform state-of-the-art methods in terms of accuracy and efficiency.
Index Terms—Object recognition, face recognition, image sets, canonical correlation, principal angles, canonical correlation analysis, linear discriminant analysis, orthogonal subspace method.
Kernel Principal Angles for Classification Machines with Applications to Image Sequence Interpretation
, 2002
Abstract

Cited by 37 (6 self)
We consider the problem of learning with instances defined over a space of sets of vectors. We derive a new positive definite kernel f(A, B) defined over pairs of matrices A, B based on the concept of principal angles between two linear subspaces. We show that the principal angles can be recovered using only inner products between pairs of column vectors of the input matrices, thereby allowing the original column vectors of A, B to be mapped onto arbitrarily high-dimensional feature spaces.
The analysis of vegetation-environment relationships by canonical correspondence analysis
, 1987
Abstract

Cited by 22 (1 self)
Canonical correspondence analysis (CCA) is introduced as a multivariate extension of weighted averaging ordination, which is a simple method for arranging species along environmental variables. CCA constructs those linear combinations of environmental variables, along which the distributions of the species are maximally separated. The eigenvalues produced by CCA measure this separation. As its name suggests, CCA is also a correspondence analysis technique, but one in which the ordination axes are constrained to be linear combinations of environmental variables. The ordination diagram generated by CCA visualizes not only a pattern of community variation (as in standard ordination) but also the main features of the distributions of species along the environmental variables. Applications demonstrate that CCA can be used both for detecting species-environment relations, and for investigating specific questions about the response of species to environmental variables. Questions in community ecology that have typically been studied by 'indirect' gradient analysis (i.e. ordination followed by external interpretation of the axes) can now be answered more directly by CCA.
The Gifi System Of Descriptive Multivariate Analysis
 STATISTICAL SCIENCE
, 1998
Abstract

Cited by 18 (3 self)
The Gifi system of analyzing categorical data through nonlinear varieties of classical multivariate analysis techniques is reviewed. The system is characterized by the optimal scaling of categorical variables which is implemented through alternating least squares algorithms. The main technique of homogeneity analysis is presented, along with its extensions and generalizations leading to nonmetric principal components analysis and canonical correlation analysis. A brief account of stability issues and areas of applications of the techniques is also given.
The Geometry of Kernel Canonical Correlation Analysis
, 2003
Abstract

Cited by 18 (0 self)
Canonical correlation analysis (CCA) is a classical multivariate method concerned with describing linear dependencies between sets of variables. After a short exposition of the linear sample CCA problem and its analytical solution, the article proceeds with a detailed characterization of its geometry. Projection operators are used to illustrate the relations between canonical vectors and variates. The article then addresses the problem of CCA between spaces spanned by objects mapped into kernel feature spaces. An exact solution for this kernel canonical correlation (KCCA) problem is derived from a geometric point of view. It shows that the expansion coefficients of the canonical vectors in their respective feature space can be found by linear CCA in the basis induced by kernel principal component analysis. The effect of mappings into higher-dimensional feature spaces is considered critically, since in general it trivializes the CCA problem. Then two regularized variants of KCCA are discussed. Relations to other methods are illustrated, e.g., multi-category kernel Fisher discriminant analysis, kernel principal component regression and possible applications thereof in blind source separation.
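The linear sample CCA problem the article starts from can be sketched directly: whiten each variable set by its covariance and read the canonical correlations off the singular values of the whitened cross-covariance. This is a minimal sketch of that analytical solution; the small ridge term `reg` is an assumption added here for numerical stability, not part of the article's derivation:

```python
import numpy as np

def linear_cca(X, Y, reg=1e-8):
    """Sample canonical correlations between variable sets X and Y
    (rows = observations, columns = variables), in descending order."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])  # within-set covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n                             # cross-covariance
    # Whiten each block via the inverse Cholesky factor of its covariance;
    # the singular values of the whitened cross-covariance are the
    # canonical correlations.
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    corr = np.linalg.svd(Wx @ Cxy @ Wy.T, compute_uv=False)
    return np.clip(corr, 0.0, 1.0)

# If Y is an exact linear transform of X, every canonical
# correlation is (numerically) 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y = X @ np.array([[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]])
print(linear_cca(X, Y))  # both canonical correlations ~ 1.0
```

The kernelized version discussed in the article replaces the raw variables with kernel-feature-space coordinates (equivalently, a kernel PCA basis), at which point regularization such as the ridge above becomes essential rather than optional.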
Boosted manifold principal angles for image set-based recognition
 Pattern Recognition 40
, 2007
Learning over sets using Boosted Manifold Principal Angles (BoMPA)
 Proc. British Machine Vision Conference
, 2005
Abstract

Cited by 13 (5 self)
In this paper we address the problem of classifying vector sets. We motivate and introduce a novel method based on comparisons between corresponding vector subspaces. In particular, there are two main areas of novelty: (i) we extend the concept of principal angles between linear subspaces to manifolds with arbitrary nonlinearities; (ii) it is demonstrated how boosting can be used for application-optimal principal angle fusion. The strengths of the proposed method are empirically demonstrated on the task of automatic face recognition (AFR), in which it is shown to outperform state-of-the-art methods in the literature.
Subspace Angles between ARMA Models
, 2002
Abstract

Cited by 12 (2 self)
We define a notion of subspace angles between two linear, autoregressive moving average (ARMA), single-input single-output (SISO) models by considering the principal angles between subspaces that are derived from these models. We show how a recently defined metric for these models, which is based on their cepstra, relates to the subspace angles between the models.
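SciPy exposes the principal-angle computation directly, so the final step of such a comparison can be sketched once the two subspaces are in hand; deriving those subspaces from the ARMA models is the paper's contribution and is not reproduced here, so the bases below are generic stand-ins:

```python
import numpy as np
from scipy.linalg import subspace_angles

# Two 2-D subspaces of R^3 that share the direction e1: the principal
# angles are 90 degrees (between e2 and e3) and 0 degrees (along e1).
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # span{e1, e2}
B = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])  # span{e1, e3}

angles = subspace_angles(A, B)  # radians, largest angle first
print(np.rad2deg(angles))
```

`scipy.linalg.subspace_angles` returns the angles themselves rather than their cosines, which is convenient when the angles feed into a distance or metric between models, as in the cepstral metric the abstract relates them to.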