Results 1–10 of 21
Neighbourhood components analysis
 Advances in Neural Information Processing Systems 17
, 2004
"... In this paper we propose a novel method for learning a Mahalanobis distance measure to be used in the KNN classification algorithm. The algorithm directly maximizes a stochastic variant of the leaveoneout KNN score on the training set. It can also learn a lowdimensional linear embedding of labele ..."
Abstract

Cited by 225 (8 self)
In this paper we propose a novel method for learning a Mahalanobis distance measure to be used in the KNN classification algorithm. The algorithm directly maximizes a stochastic variant of the leave-one-out KNN score on the training set. It can also learn a low-dimensional linear embedding of labeled data that can be used for data visualization and fast classification. Unlike other methods, our classification model is non-parametric, making no assumptions about the shape of the class distributions or the boundaries between them. The performance of the method is demonstrated on several data sets, both for metric learning and linear dimensionality reduction.
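The objective described above can be illustrated with a minimal NumPy sketch (not the paper's code; only the leave-one-out score for a fixed projection matrix `A` is shown, and the gradient-ascent step that actually learns `A` is omitted):

```python
import numpy as np

def nca_objective(A, X, y):
    """Stochastic leave-one-out KNN score in the NCA sense: each point i
    selects neighbor j with probability softmax(-||A x_i - A x_j||^2),
    and the objective is the expected number of correctly classified points."""
    Z = X @ A.T                                  # project into the learned space
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                 # a point may not select itself
    P = np.exp(-d2)
    P /= P.sum(axis=1, keepdims=True)            # soft neighbor probabilities
    same = (y[:, None] == y[None, :])            # same-class indicator
    return (P * same).sum()                      # expected correct LOO count
```

For two tight, well-separated classes and an identity projection, the score approaches the number of training points.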
Graph Drawing by High-Dimensional Embedding
 In GD02, LNCS
, 2002
"... We present a novel approach to the aesthetic drawing of undirected graphs. The method has two phases: first embed the graph in a very high dimension and then project it into the 2D plane using PCA. Experiments we have carried out show the ability of the method to draw graphs of 10 nodes in few seco ..."
Abstract

Cited by 64 (10 self)
We present a novel approach to the aesthetic drawing of undirected graphs. The method has two phases: first embed the graph in a very high dimension and then project it into the 2D plane using PCA. Experiments we have carried out show the ability of the method to draw graphs of 10^5 nodes in a few seconds. The new method appears to have several advantages over classical methods, including a significantly better running time, a useful inherent capability to exhibit the graph in various dimensions, and an effective means for interactive exploration of large graphs.
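The two-phase idea admits a compact sketch (a hedged reconstruction, not the authors' implementation: the high-dimensional coordinates are assumed here to be BFS distances to a few greedily chosen pivot nodes, followed by a PCA projection to the plane):

```python
import numpy as np
from collections import deque

def hde_layout(adj, m=4, seed=0):
    """High-dimensional-embedding layout sketch: each node's m coordinates
    are its BFS distances to m pivots; PCA then projects to 2D.
    `adj` is a dict mapping node -> list of neighbours (connected graph)."""
    nodes = sorted(adj)
    rng = np.random.default_rng(seed)

    def bfs(src):
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return np.array([dist[v] for v in nodes], float)

    # greedy pivots: each new pivot is farthest from all pivots chosen so far
    pivots = [nodes[rng.integers(len(nodes))]]
    D = [bfs(pivots[0])]
    for _ in range(m - 1):
        far = nodes[int(np.minimum.reduce(D).argmax())]
        pivots.append(far)
        D.append(bfs(far))
    X = np.stack(D, axis=1)                    # n x m embedding

    # phase two: project the high-dimensional embedding with PCA
    X = X - X.mean(0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:2].T                        # n x 2 layout
```

On a path graph the first principal axis recovers the node ordering, so the endpoints land far apart in the drawing.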
Rolling the Dice: Multidimensional Visual Exploration using Scatterplot Matrix Navigation Visualization
 Issue 6, Nov.–Dec. 2008, p. 1539
"... Abstract—Scatterplots remain one of the most popular and widelyused visual representations for multidimensional data due to their simplicity, familiarity and visual clarity, even if they lack some of the flexibility and visual expressiveness of newer multidimensional visualization techniques. This ..."
Abstract

Cited by 60 (6 self)
Scatterplots remain one of the most popular and widely used visual representations for multidimensional data due to their simplicity, familiarity and visual clarity, even if they lack some of the flexibility and visual expressiveness of newer multidimensional visualization techniques. This paper presents new interactive methods to explore multidimensional data using scatterplots. This exploration is performed using a matrix of scatterplots that gives an overview of the possible configurations, thumbnails of the scatterplots, and support for interactive navigation in the multidimensional space. Transitions between scatterplots are performed as animated rotations in 3D space, somewhat akin to rolling dice. Users can iteratively build queries using bounding volumes in the dataset, sculpting the query from different viewpoints to become more and more refined. Furthermore, the dimensions in the navigation space can be reordered, manually or automatically, to highlight salient correlations and differences among them. An example scenario presents the interaction techniques supporting smooth and effortless visual exploration of multidimensional datasets.
Index Terms—Visual exploration, visual queries, visual analytics, navigation, multivariate data, interaction.
Drawing graphs by eigenvectors: Theory and practice
 Computers and Mathematics with Applications
, 2005
"... Abstract. The spectral approach for graph visualization computes the layout of a graph using certain eigenvectors of related matrices. Some important advantages of this approach are an ability to compute optimal layouts (according to specific requirements) and a very rapid computation time. In this ..."
Abstract

Cited by 15 (1 self)
The spectral approach for graph visualization computes the layout of a graph using certain eigenvectors of related matrices. Some important advantages of this approach are an ability to compute optimal layouts (according to specific requirements) and a very rapid computation time. In this paper we explore spectral visualization techniques and study their properties from different points of view. We also suggest a novel algorithm for calculating spectral layouts resulting in an extremely fast computation by optimizing the layout within a small vector space.
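The basic spectral layout can be sketched in a few lines, assuming the common choice of the combinatorial Laplacian L = D − A and its eigenvectors for the two smallest nonzero eigenvalues (the paper's accelerated small-subspace algorithm is not reproduced here):

```python
import numpy as np

def spectral_layout(A):
    """Spectral graph layout sketch: 2D coordinates are the Laplacian
    eigenvectors for the two smallest nonzero eigenvalues.
    `A` is a symmetric adjacency matrix of a connected graph."""
    A = np.asarray(A, float)
    L = np.diag(A.sum(1)) - A            # combinatorial Laplacian L = D - A
    w, V = np.linalg.eigh(L)             # ascending eigenvalues; w[0] ~ 0
    return V[:, 1:3]                     # skip the constant eigenvector
```

Because the layout columns are orthogonal to the constant eigenvector, each coordinate is automatically centered at zero.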
Uncorrelated multilinear discriminant analysis with regularization and aggregation for tensor object recognition
 IEEE Trans. Neural Netw
, 2009
"... This paper proposes a novel uncorrelated multilinear discriminant analysis (UMLDA) algorithm for the challenging problem of gait recognition. A tensortovector projection (TVP) of tensor objects is formulated and the UMLDA is developed using TVP to extract uncorrelated discriminative features direc ..."
Abstract

Cited by 15 (11 self)
This paper proposes a novel uncorrelated multilinear discriminant analysis (UMLDA) algorithm for the challenging problem of gait recognition. A tensor-to-vector projection (TVP) of tensor objects is formulated and the UMLDA is developed using TVP to extract uncorrelated discriminative features directly from tensorial data. The small-sample-size (SSS) problem present when discriminant solutions are applied to the problem of gait recognition is discussed and a regularization procedure is introduced to address it. The effectiveness of the proposed regularization is demonstrated in the experiments and the regularized UMLDA algorithm is shown to outperform other multilinear subspace solutions in gait recognition.
Uncorrelated Multilinear Principal Component Analysis through Successive Variance Maximization
"... Tensorial data are frequently encountered in various machine learning tasks today and dimensionality reduction is one of their most important applications. This paper extends the classical principal component analysis (PCA) to its multilinear version by proposing a novel unsupervised dimensionality ..."
Abstract

Cited by 8 (5 self)
Tensorial data are frequently encountered in various machine learning tasks today and dimensionality reduction is one of their most important applications. This paper extends the classical principal component analysis (PCA) to its multilinear version by proposing a novel unsupervised dimensionality reduction algorithm for tensorial data, named uncorrelated multilinear PCA (UMPCA). UMPCA seeks a tensor-to-vector projection that captures most of the variation in the original tensorial input while producing uncorrelated features through successive variance maximization. We evaluate the UMPCA on a second-order tensorial problem, face recognition, and the experimental results show its superiority, especially in low-dimensional spaces, through the comparison with three other PCA-based algorithms.
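For second-order tensors (matrices), the first step of such a tensor-to-vector projection can be sketched as an alternating update of the two mode vectors (an illustrative sketch only: it computes one elementary multilinear projection by variance maximization, and the zero-correlation constraint UMPCA imposes on later features is omitted):

```python
import numpy as np

def first_emp(X, iters=20, seed=0):
    """First elementary multilinear projection for matrix samples X (k x I x J):
    alternately update mode vectors u, v to maximize the variance of the
    scalar feature u^T X_k v (later UMPCA features would additionally be
    constrained to be uncorrelated with this one)."""
    X = X - X.mean(0)                            # center the tensor samples
    rng = np.random.default_rng(seed)
    I, J = X.shape[1:]
    v = rng.standard_normal(J)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        Y = X @ v                                # k x I, rows X_k v
        u = np.linalg.eigh(Y.T @ Y)[1][:, -1]    # best u given v
        Z = np.einsum('kij,i->kj', X, u)         # k x J, rows X_k^T u
        v = np.linalg.eigh(Z.T @ Z)[1][:, -1]    # best v given u
    feats = np.einsum('kij,i,j->k', X, u, v)     # projected scalar features
    return u, v, feats
```

On samples that are noisy multiples of a single rank-one matrix, the extracted feature recovers the underlying scalar factor up to sign.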
Principal graphs and manifolds
 in “Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods and Techniques
"... In many physical statistical, biological and other investigations it is desirable to approximate a system of points by objects of lower dimension and/or complexity. For this purpose, Karl Pearson invented principal component analysis in 1901 and found ‘lines and planes of closest fit to system of po ..."
Abstract

Cited by 5 (2 self)
In many physical, statistical, biological and other investigations it is desirable to approximate a system of points by objects of lower dimension and/or complexity. For this purpose, Karl Pearson invented principal component analysis in 1901 and found ‘lines and planes of closest fit to systems of points’. The famous k-means algorithm solves the approximation problem too, but by finite sets instead of lines and planes. This chapter gives a brief practical introduction to the methods of construction of general principal objects, i.e. objects embedded in the ‘middle’ of the multidimensional data set. As a basis, the unifying framework of mean squared distance approximation of finite datasets is selected. Principal graphs and manifolds are constructed as generalisations of principal components and k-means principal points. For this purpose, the family of expectation/maximisation algorithms with nearest generalisations is presented. Construction of principal graphs with controlled complexity is based on the graph grammar approach.
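The two classical extremes named here, Pearson's principal line and k-means principal points, can both be scored in the same mean-squared-distance framework (an illustrative sketch, not the chapter's algorithms; the k-means part is a plain Lloyd iteration):

```python
import numpy as np

def msd_to_line(X):
    """Mean squared distance from points to Pearson's first principal line."""
    Xc = X - X.mean(0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    resid = Xc - np.outer(Xc @ Vt[0], Vt[0])     # residual off the best-fit line
    return (resid ** 2).sum(1).mean()

def msd_to_kmeans(X, k=2, iters=50, seed=0):
    """Mean squared distance from points to the nearest of k principal points
    (Lloyd's algorithm; principal graphs interpolate between these extremes)."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)]  # initial centers = data points
    for _ in range(iters):
        lab = ((X[:, None] - C[None]) ** 2).sum(-1).argmin(1)
        # keep a center unchanged if its cluster momentarily empties
        C = np.array([X[lab == j].mean(0) if (lab == j).any() else C[j]
                      for j in range(k)])
    return ((X - C[lab]) ** 2).sum(1).mean()
```

Collinear data is fit perfectly by the principal line, while well-separated clusters are fit well by a finite set of principal points.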
Gender Classification based on Facial Surface Normals
"... In this paper, we perform gender classification based on 2.5D facial surface normals (facial needlemaps), and present two novel principal geodesic analysis (PGA) methods, weighted PGA and supervised PGA, to parameterize the facial needlemaps, and compare their performances with PGA for gender clas ..."
Abstract
In this paper, we perform gender classification based on 2.5D facial surface normals (facial needle-maps), and present two novel principal geodesic analysis (PGA) methods, weighted PGA and supervised PGA, to parameterize the facial needle-maps, and compare their performance with PGA for gender classification. Experimental results demonstrate the feasibility of gender classification based on facial needle-maps, and show that incorporating weights or pairwise relationships of labeled data into PGA improves the gender discriminating power in the leading eigenvectors and the gender classification accuracy.
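Plain PGA on unit surface normals can be sketched via the usual tangent-space approximation (an assumption-laden illustration: intrinsic mean by iterated log/exp maps on the sphere, then ordinary PCA in the tangent plane; the paper's weighted and supervised variants are not shown):

```python
import numpy as np

def log_map(mu, N):
    """Sphere log map: tangent vectors at mu pointing toward each row of N."""
    c = np.clip(N @ mu, -1.0, 1.0)
    theta = np.arccos(c)                          # geodesic distances from mu
    U = N - c[:, None] * mu                       # components orthogonal to mu
    norm = np.linalg.norm(U, axis=1)
    scale = np.where(norm > 1e-12, theta / np.maximum(norm, 1e-12), 0.0)
    return U * scale[:, None]

def exp_map(mu, v):
    """Sphere exp map: walk from mu along the tangent vector v."""
    t = np.linalg.norm(v)
    return mu if t < 1e-12 else np.cos(t) * mu + np.sin(t) * v / t

def pga_sphere(N, iters=10):
    """Tangent-space PGA sketch for unit normals N (n x 3): find the intrinsic
    (Karcher) mean, then run ordinary PCA on the log-mapped data. Assumes the
    normals are clustered well away from the mean's antipode."""
    mu = N.mean(0)
    mu /= np.linalg.norm(mu)
    for _ in range(iters):                        # gradient steps toward the mean
        mu = exp_map(mu, log_map(mu, N).mean(0))
    T = log_map(mu, N)
    _, s, Vt = np.linalg.svd(T - T.mean(0), full_matrices=False)
    return mu, Vt, s ** 2 / len(N)                # mean, directions, variances
```

For normals spread along one great circle through the pole, the recovered mean is the pole and the leading principal direction is the circle's tangent.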
Splitting Technique Initialization in Local PCA
, 2006
"... Abstract: The local Principal Component Analysis (PCA) reduces linearly redundant components that may present in higher dimensional space. It deploys an initial guess technique which can be utilized when the distribution of a given multivariate data is known to the user. The problem in initializatio ..."
Abstract
The local Principal Component Analysis (PCA) reduces linearly redundant components that may be present in a higher-dimensional space. It deploys an initial-guess technique which can be utilized when the distribution of a given multivariate data set is known to the user. The problem in initialization arises when the distribution is not known. This study explores a technique that can be easily integrated in the local PCA design and is efficient even when the underlying statistical distribution is unknown. Initialization with the proposed splitting technique reproduces not only the mean vector but also the orientation of components in the subspace domain. This ensures that all clusters are used in the design. The proposed integration with the reconstruction-distance local PCA design enables easier data processing and more accurate representation of multivariate data. A comparative approach is undertaken to demonstrate the greater effectiveness of the proposed approach in terms of percentage error.