## Visualization of Labeled Data Using Linear Transformations

### Download Links

- [www.wisdom.weizmann.ac.il]
- [wwwisg.cs.uni-magdeburg.de]
- [www.research.att.com]

### Other Repositories/Bibliography

- DBLP

Citations: 15 (1 self)

### BibTeX

```bibtex
@MISC{Koren_visualizationof,
  author = {Yehuda Koren and Liran Carmel},
  title = {Visualization of Labeled Data Using Linear Transformations},
  year = {}
}
```

### Abstract

We present a novel family of data-driven linear transformations, aimed at visualizing multivariate data in a low-dimensional space in a way that optimally preserves the structure of the data. The well-studied PCA and Fisher's LDA are shown to be special members of this family of transformations, and we demonstrate how to generalize these two methods so as to enhance their performance. Furthermore, our technique is, to the best of our knowledge, the only one whose resulting embedding reflects both the data coordinates and pairwise similarities and/or dissimilarities between the data elements. Moreover, when information on the clustering (labeling) decomposition of the data is known, it can be integrated into the linear transformation, resulting in embeddings that clearly show the separation between the clusters, as well as their intra-structure. All this makes our technique very flexible and powerful, and lets us cope with kinds of data that other techniques fail to describe properly.
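The abstract names PCA as the simplest special member of this family of linear transformations. As background only (this is a generic illustration, not the authors' generalized method), a minimal sketch of PCA as a data-driven linear map from a high-dimensional space to 2-D:

```python
import numpy as np

def pca_project(X, k=2):
    """Project an n x d data matrix X onto its top-k principal axes.

    Background sketch: PCA picks the linear transformation W whose
    columns are the top-k eigenvectors of the data covariance, so the
    embedding X_centered @ W preserves as much variance as possible.
    """
    Xc = X - X.mean(axis=0)               # center the data
    cov = Xc.T @ Xc / (len(X) - 1)        # d x d sample covariance
    vals, vecs = np.linalg.eigh(cov)      # eigh returns ascending eigenvalues
    W = vecs[:, ::-1][:, :k]              # top-k eigenvectors as columns
    return Xc @ W                         # n x k low-dimensional embedding

# Example: 100 samples in 5-D mapped down to 2-D
X = np.random.default_rng(0).normal(size=(100, 5))
Y = pca_project(X, k=2)
print(Y.shape)  # (100, 2)
```

The paper's contribution is to replace this variance-only criterion with one that can also incorporate pairwise (dis)similarities and cluster labels, with PCA and LDA recovered as special cases.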

### Citations

279 | Statistical pattern recognition
- Webb
- 2002

Citation Context: ...s in a low-dimensional space (mostly 2-D or 3-D) in a way that captures certain structured components of the data. There are numerous techniques in this family, including principal component analysis [4, 11], multidimensional scaling [10], eigenprojection [5, 7, 8], and force-directed placement [3, 6, 2, 9]. We are particularly interested in the sub-family of methods that use linear transformations to ma...

159 | An r-dimensional quadratic placement algorithm
- Hall
- 1970

Citation Context: ...y that captures certain structured components of the data. There are numerous techniques in this family, including principal component analysis [4, 11], multidimensional scaling [10], eigenprojection [5, 7, 8], and force-directed placement [3, 6, 2, 9]. We are particularly interested in the sub-family of methods that use linear transformations to map the high-dimensional data into a low-dimensional space. ...

94 | Applied Multivariate Data Analysis
- Everitt, Dunn
- 1991

Citation Context: ...s in a low-dimensional space (mostly 2-D or 3-D) in a way that captures certain structured components of the data. There are numerous techniques in this family, including principal component analysis [4, 11], multidimensional scaling [10], eigenprojection [5, 7, 8], and force-directed placement [3, 6, 2, 9]. We are particularly interested in the sub-family of methods that use linear transformations to ma...

62 | Ace: A fast multiscale eigenvectors computation for drawing huge graphs
- Koren, Harel
- 2002
Citation Context: ...y that captures certain structured components of the data. There are numerous techniques in this family, including principal component analysis [4, 11], multidimensional scaling [10], eigenprojection [5, 7, 8], and force-directed placement [3, 6, 2, 9]. We are particularly interested in the sub-family of methods that use linear transformations to map the high-dimensional data into a low-dimensional space. ...

45 | Introduction to multidimensional scaling. Theory, methods, and applications
- Schiffman, Reynolds, et al.
- 1981

Citation Context: ...ly 2-D or 3-D) in a way that captures certain structured components of the data. There are numerous techniques in this family, including principal component analysis [4, 11], multidimensional scaling [10], eigenprojection [5, 7, 8], and force-directed placement [3, 6, 2, 9]. We are particularly interested in the sub-family of methods that use linear transformations to map the high-dimensional data int...

43 | On spectral graph drawing
- Koren
- 2003
Citation Context: ...y that captures certain structured components of the data. There are numerous techniques in this family, including principal component analysis [4, 11], multidimensional scaling [10], eigenprojection [5, 7, 8], and force-directed placement [3, 6, 2, 9]. We are particularly interested in the sub-family of methods that use linear transformations to map the high-dimensional data into a low-dimensional space. ...

37 | Drawing Graphs
- Kaufmann, Wagner
- 2001

26 | A Hybrid Layout Algorithm for Sub-Quadratic Multidimensional Scaling
- Morrison, Ross, et al.
- 2002

Citation Context: ...nents of the data. There are numerous techniques in this family, including principal component analysis [4, 11], multidimensional scaling [10], eigenprojection [5, 7, 8], and force-directed placement [3, 6, 2, 9]. We are particularly interested in the sub-family of methods that use linear transformations to map the high-dimensional data into a low-dimensional space. This way, each low-dimensional axis is some...

11 | Visualizing and Classifying Odors Using a Similarity Matrix
- Carmel, Koren, et al.
- 2003
Citation Context: ... resulting in a 16-D vector representing that sample. In total, we have performed 300 measurements to yield a dataset of 300 elements in 16-D that are partitioned into 30 clusters. In a separate work [1], we have developed a technique to derive from the raw data pairwise similarity values between any two samples. In 5(a) we show a 2-D embedding of this dataset using our method, where inter-cluster si...

8 | Cluster Stability and the Use of Noise
- Davidson, Wylie, et al.
- 2001

Citation Context: ...nents of the data. There are numerous techniques in this family, including principal component analysis [4, 11], multidimensional scaling [10], eigenprojection [5, 7, 8], and force-directed placement [3, 6, 2, 9]. We are particularly interested in the sub-family of methods that use linear transformations to map the high-dimensional data into a low-dimensional space. This way, each low-dimensional axis is some...