Results 1–10 of 25
Neighbourhood components analysis
Advances in Neural Information Processing Systems 17, 2004
Cited by 325 (9 self)
Abstract:
In this paper we propose a novel method for learning a Mahalanobis distance measure to be used in the KNN classification algorithm. The algorithm directly maximizes a stochastic variant of the leave-one-out KNN score on the training set. It can also learn a low-dimensional linear embedding of labeled data that can be used for data visualization and fast classification. Unlike other methods, our classification model is non-parametric, making no assumptions about the shape of the class distributions or the boundaries between them. The performance of the method is demonstrated on several data sets, both for metric learning and linear dimensionality reduction.
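The stochastic leave-one-out objective described in the abstract can be sketched in a few lines of numpy. The data, the identity metric, and the function name below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def nca_objective(A, X, y):
    """Stochastic leave-one-out objective from the abstract: point i picks
    neighbour j with probability p_ij proportional to exp(-||A x_i - A x_j||^2)
    (with p_ii = 0); the score is the expected number of correctly
    classified training points."""
    Z = X @ A.T                                            # project with linear map A
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)    # pairwise squared distances
    np.fill_diagonal(d2, np.inf)                           # a point never picks itself
    P = np.exp(-d2)
    P /= P.sum(axis=1, keepdims=True)                      # softmax over neighbours
    return (P * (y[:, None] == y[None, :])).sum()          # mass on same-class pairs

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
y = np.array([0] * 5 + [1] * 5)
score = nca_objective(np.eye(2), X, y)   # identity metric, well-separated classes
```

The method maximizes this objective over A (e.g. by gradient ascent); the learned Mahalanobis metric is then A^T A, and a low-rank A gives the linear embedding mentioned above.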
Rolling the Dice: Multidimensional Visual Exploration using Scatterplot Matrix Navigation
IEEE Transactions on Visualization and Computer Graphics, Issue 6, Nov.-Dec. 2008, Page(s): 1539
Cited by 109 (15 self)
Abstract:
Scatterplots remain one of the most popular and widely used visual representations for multidimensional data due to their simplicity, familiarity and visual clarity, even if they lack some of the flexibility and visual expressiveness of newer multidimensional visualization techniques. This paper presents new interactive methods to explore multidimensional data using scatterplots. This exploration is performed using a matrix of scatterplots that gives an overview of the possible configurations, thumbnails of the scatterplots, and support for interactive navigation in the multidimensional space. Transitions between scatterplots are performed as animated rotations in 3D space, somewhat akin to rolling dice. Users can iteratively build queries using bounding volumes in the dataset, sculpting the query from different viewpoints to make it more and more refined. Furthermore, the dimensions in the navigation space can be reordered, manually or automatically, to highlight salient correlations and differences among them. An example scenario presents the interaction techniques supporting smooth and effortless visual exploration of multidimensional datasets.
Index Terms: Visual exploration, visual queries, visual analytics, navigation, multivariate data, interaction.
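As one concrete reading of the automatic dimension reordering mentioned in the abstract, here is a small greedy sketch that places highly correlated dimensions next to each other in the navigation order; the criterion is an illustrative assumption, not necessarily the paper's:

```python
import numpy as np

def order_dimensions(X):
    """Greedy ordering that places highly |correlated| dimensions next to
    each other -- one simple criterion for automatically reordering the
    scatterplot-matrix navigation space."""
    C = np.abs(np.corrcoef(X, rowvar=False))
    np.fill_diagonal(C, -1.0)                  # never pair a dimension with itself
    order = [int(C.max(axis=1).argmax())]      # start at the most correlated dim
    remaining = set(range(X.shape[1])) - {order[0]}
    while remaining:
        nxt = max(remaining, key=lambda j: C[order[-1], j])
        order.append(nxt)
        remaining.remove(nxt)
    return order

rng = np.random.default_rng(1)
a = rng.normal(size=200)
X = np.column_stack([a, rng.normal(size=200), a + 0.01 * rng.normal(size=200)])
order = order_dimensions(X)   # dims 0 and 2 are near-duplicates -> adjacent
```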
Graph Drawing by High-Dimensional Embedding
In GD02, LNCS, 2002
Cited by 71 (10 self)
Abstract:
We present a novel approach to the aesthetic drawing of undirected graphs. The method has two phases: first embed the graph in a very high dimension and then project it into the 2D plane using PCA. Experiments we have carried out show the ability of the method to draw graphs of 10^5 nodes in a few seconds. The new method appears to have several advantages over classical methods, including a significantly better running time, a useful inherent capability to exhibit the graph in various dimensions, and an effective means for interactive exploration of large graphs.
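The two-phase method the abstract outlines can be sketched as follows, assuming the high-dimensional coordinates are graph distances to a few farthest-point pivots (function names and the tiny example graph are illustrative, not the paper's code):

```python
import numpy as np
from collections import deque

def bfs_dist(adj, src):
    """Hop distances from src to every node (graph assumed connected)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def hde_layout(adj, m=2):
    """Phase 1: embed each node as its vector of graph distances to m
    farthest-point pivots. Phase 2: project the m-dimensional embedding
    into the 2D plane with PCA (here via SVD of the centred matrix)."""
    n = len(adj)
    pivots, cur = [], 0
    mind = np.full(n, np.inf)
    for _ in range(m):
        pivots.append(cur)
        d = bfs_dist(adj, cur)
        mind = np.minimum(mind, [d[v] for v in range(n)])
        cur = int(mind.argmax())          # next pivot: node farthest from all pivots
    dists = {p: bfs_dist(adj, p) for p in pivots}
    H = np.array([[dists[p][v] for p in pivots] for v in range(n)], float)
    H -= H.mean(axis=0)                   # centre before PCA
    U, S, _ = np.linalg.svd(H, full_matrices=False)
    return U[:, :2] * S[:2]               # 2D coordinates

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}  # path graph 0-1-2-3-4
pos = hde_layout(adj)
```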
Drawing graphs by eigenvectors: Theory and practice
Computers and Mathematics with Applications, 2005
Cited by 25 (1 self)
Abstract:
The spectral approach for graph visualization computes the layout of a graph using certain eigenvectors of related matrices. Important advantages of this approach are the ability to compute optimal layouts (according to specific requirements) and very rapid computation. In this paper we explore spectral visualization techniques and study their properties from different points of view. We also suggest a novel algorithm for calculating spectral layouts that achieves extremely fast computation by optimizing the layout within a small vector space.
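A basic instance of the spectral approach, assuming the layout coordinates are taken from the Laplacian eigenvectors with the two smallest nonzero eigenvalues (a common variant; the paper studies several):

```python
import numpy as np

def spectral_layout(A):
    """Use the eigenvectors of the graph Laplacian L = D - A belonging to
    the two smallest *nonzero* eigenvalues as x/y coordinates -- one basic
    instance of spectral graph drawing."""
    L = np.diag(A.sum(axis=1)) - A
    vals, vecs = np.linalg.eigh(L)   # eigh returns eigenvalues in ascending order
    return vecs[:, 1:3]              # skip the constant eigenvector (eigenvalue 0)

n = 6                                # 6-cycle: its spectral layout is a regular hexagon
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
pos = spectral_layout(A)
```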
Uncorrelated multilinear discriminant analysis with regularization and aggregation for tensor object recognition
IEEE Trans. Neural Netw., 2009
Cited by 19 (12 self)
Abstract:
This paper proposes a novel uncorrelated multilinear discriminant analysis (UMLDA) algorithm for the challenging problem of gait recognition. A tensor-to-vector projection (TVP) of tensor objects is formulated, and the UMLDA is developed using TVP to extract uncorrelated discriminative features directly from tensorial data. The small-sample-size (SSS) problem that arises when discriminant solutions are applied to gait recognition is discussed, and a regularization procedure is introduced to address it. The effectiveness of the proposed regularization is demonstrated in the experiments, and the regularized UMLDA algorithm is shown to outperform other multilinear subspace solutions in gait recognition.
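The tensor-to-vector projection at the heart of the method can be sketched for second-order tensors (matrices); the random projection vectors below are placeholders for the discriminative ones UMLDA actually learns:

```python
import numpy as np

def tvp_features(X, us, vs):
    """Tensor-to-vector projection for second-order tensors (matrices):
    feature p is the elementary multilinear projection y_p = u_p^T X v_p,
    so the sample is mapped to a vector without ever being vectorised."""
    return np.array([u @ X @ v for u, v in zip(us, vs)])

rng = np.random.default_rng(2)
X = rng.normal(size=(4, 5))                  # one 4x5 tensor (matrix) sample
us = [rng.normal(size=4) for _ in range(3)]  # mode-1 projection vectors
vs = [rng.normal(size=5) for _ in range(3)]  # mode-2 projection vectors
y = tvp_features(X, us, vs)                  # 3-dimensional feature vector
```

Note that each feature uses only 4 + 5 = 9 parameters rather than the 20 of a vectorised linear projection, which is what makes TVP attractive in small-sample-size settings.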
Uncorrelated Multilinear Principal Component Analysis through Successive Variance Maximization
Cited by 11 (7 self)
Abstract:
Tensorial data are frequently encountered in various machine learning tasks today, and dimensionality reduction is one of their most important applications. This paper extends classical principal component analysis (PCA) to its multilinear version by proposing a novel unsupervised dimensionality reduction algorithm for tensorial data, named uncorrelated multilinear PCA (UMPCA). UMPCA seeks a tensor-to-vector projection that captures most of the variation in the original tensorial input while producing uncorrelated features through successive variance maximization. We evaluate UMPCA on a second-order tensorial problem, face recognition, and the experimental results show its superiority, especially in low-dimensional spaces, through comparison with three other PCA-based algorithms.
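For ordinary vector data, successive variance maximization with deflation reduces to PCA; this linear sketch illustrates the "uncorrelated features, one at a time" idea that UMPCA carries over to tensor-to-vector projections (an analogy for intuition, not the paper's algorithm):

```python
import numpy as np

def successive_variance_max(X, p):
    """Extract p projection directions one at a time, each maximising the
    variance of its scores while staying uncorrelated with earlier scores
    (realised here by deflating the centred data after each extraction)."""
    Xc = X - X.mean(axis=0)
    R = Xc.copy()
    W, T = [], []
    for _ in range(p):
        _, _, Vt = np.linalg.svd(R, full_matrices=False)
        w = Vt[0]                        # top variance direction of deflated data
        W.append(w)
        T.append(Xc @ w)                 # scores of the original (centred) data
        R = R - np.outer(R @ w, w)       # deflate: remove the extracted direction
    return np.array(W), np.array(T)

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 6)) @ rng.normal(size=(6, 6))
W, T = successive_variance_max(X, 3)
```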
Principal graphs and manifolds
In “Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods and Techniques”
Cited by 8 (3 self)
Abstract:
In many physical, statistical, biological and other investigations it is desirable to approximate a system of points by objects of lower dimension and/or complexity. For this purpose, Karl Pearson invented principal component analysis in 1901 and found ‘lines and planes of closest fit to systems of points’. The famous k-means algorithm solves the approximation problem too, but by finite sets instead of lines and planes. This chapter gives a brief practical introduction to the methods of construction of general principal objects, i.e. objects embedded in the ‘middle’ of the multidimensional data set. As a basis, the unifying framework of mean squared distance approximation of finite datasets is selected. Principal graphs and manifolds are constructed as generalisations of principal components and k-means principal points. For this purpose, the family of expectation/maximisation algorithms with nearest generalisations is presented. Construction of principal graphs with controlled complexity is based on the graph grammar approach.
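The k-means approximation mentioned above, which replaces Pearson's lines and planes by a finite set of "principal points", can be sketched as follows (the deterministic initialization is chosen only to keep the example reproducible; practical implementations use k-means++ or random restarts):

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means (Lloyd's algorithm): approximate the data set by k
    'principal points', the finite-set counterpart of lines and planes
    of closest fit."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()  # spread-out init
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)                 # assign each point to nearest center
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)   # recenter on the cluster
    return centers, labels

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(-5, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
centers, labels = kmeans(X, 2)
```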
Graph Drawing by Classical Multidimensional Scaling: New Perspectives
Cited by 1 (0 self)
Abstract:
With shortest-path distances as input, classical multidimensional scaling can be regarded as a spectral graph drawing algorithm, and recent approximation techniques make it scale to very large graphs. In comparison with other methods, however, it is considered inflexible and prone to degenerate layouts for some classes of graphs. We want to challenge this belief by demonstrating that the method can be flexibly adapted to provide focus+context layouts. Moreover, we propose an alternative instantiation that appears to be more suitable for graph drawing and prevents certain degeneracies.
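Classical MDS on shortest-path distances, as described above, can be sketched in its textbook form: compute hop distances by BFS, double-center the squared distances, and embed with the top eigenvectors (a plain CMDS sketch, not the paper's focus+context adaptation):

```python
import numpy as np
from collections import deque

def graph_distances(adj):
    """All-pairs shortest-path (hop) distances via BFS from every node."""
    n = len(adj)
    D = np.zeros((n, n))
    for s in range(n):
        seen = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen[v] = seen[u] + 1
                    q.append(v)
        for v, d in seen.items():
            D[s, v] = d
    return D

def classical_mds(D, dim=2):
    """Classical MDS: double-centre the squared distances to get a Gram
    matrix, then embed with its largest eigenvalues/eigenvectors."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J               # Gram matrix of the embedding
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:dim]        # largest eigenvalues first
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}   # path graph
pos = classical_mds(graph_distances(adj))
```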
A Survey of Dimension Reduction Methods for High-Dimensional Data Analysis and Visualization
Cited by 1 (0 self)
Abstract:
Dimension reduction is commonly defined as the process of mapping high-dimensional data to a lower-dimensional embedding. Applications of dimension reduction include, but are not limited to, filtering, compression, regression, classification, feature analysis, and visualization. We review methods that compute a point-based visual representation of high-dimensional data sets to aid in exploratory data analysis. The aim is not to be exhaustive but to provide an overview of basic approaches, as well as to review selected state-of-the-art methods. Our survey is an introduction to dimension reduction from a visualization point of view. Subsequently, a comparison of state-of-the-art methods outlines relations and shared research foci.