## Curvilinear Distance Analysis versus Isomap (2002)

Venue: Proceedings of ESANN’2002, 10th European Symposium on Artificial Neural Networks

Citations: 26 (11 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Lee00curvilineardistance,
  author    = {John Aldo Lee and Amaury Lendasse and Michel Verleysen},
  title     = {Curvilinear Distance Analysis versus Isomap},
  booktitle = {Proceedings of ESANN'2002, 10th European Symposium on Artificial Neural Networks},
  year      = {2002},
  pages     = {185--192}
}
```

### Abstract

Dimension reduction techniques are widely used for the analysis and visualization of complex sets of data. This paper compares two nonlinear projection methods: Isomap and Curvilinear Distance Analysis.

### Citations

**3294** | Self-Organizing Maps (Kohonen, 2000)

> Citation context: "...ods like the Principal Component Analysis (PCA, [8]) or the original metric multidimensional scaling (MDS, [14]). In the second class are nonlinear algorithms like Kohonen’s Self-Organizing Map (SOM, [9, 10]) or nonlinear variants of the MDS. Contrarily to the linear PCA, the last ones do not use a criterion based on variance preservation. Instead, they try to reproduce in the projection space the pairwi..."

**1709** | A global geometric framework for nonlinear dimensionality reduction (Tenenbaum, Silva, et al., 2000)

> Citation context: "...n space the pairwise distances measured in the data space. This paper compares two of these nonlinear projection methods that derive more or less directly from the MDS. The first one is called Isomap [13] and is described in Section 2. Isomap differs from the linear MDS by the innovative metrics [2] used to measure the pairwise distances in the data. The second method is..."
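The "linear MDS" this context refers to recovers coordinates from pairwise distances alone, by double-centering the squared-distance matrix. A minimal sketch (illustrative only, not code from the paper, with a made-up 1-D data set): the double-centered matrix B = -1/2 · J D² J must equal the Gram matrix of the centered coordinates, which is what classical MDS eigendecomposes.

```python
# Classical (metric) MDS hinges on double centering: B = -1/2 * J D2 J,
# where D2 holds squared pairwise distances and J = I - (1/n) 11^T.
# For points with known 1-D coordinates, B must equal the Gram matrix of
# the centered coordinates -- a quick sanity check of the identity.
# Toy numbers only; not the authors' implementation.

def double_center(D2):
    """Return B = -1/2 * J D2 J for a squared-distance matrix D2."""
    n = len(D2)
    row = [sum(r) / n for r in D2]                  # row means
    col = [sum(D2[i][j] for i in range(n)) / n for j in range(n)]
    grand = sum(row) / n                            # grand mean
    return [[-0.5 * (D2[i][j] - row[i] - col[j] + grand)
             for j in range(n)] for i in range(n)]

x = [0.0, 1.0, 3.0]                                 # three points on a line
n = len(x)
D2 = [[(xi - xj) ** 2 for xj in x] for xi in x]     # squared distances
B = double_center(D2)

mean = sum(x) / n
for i in range(n):
    for j in range(n):
        gram = (x[i] - mean) * (x[j] - mean)        # Gram of centered coords
        assert abs(B[i][j] - gram) < 1e-9
print("B matches the centered Gram matrix")
```

Eigendecomposing B then yields the embedding coordinates; Isomap keeps this step and only changes how the distances in D2 are measured.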

**830** | A note on two problems in connection with graphs (Dijkstra, 1959)

> Citation context: "...the geodesic distance between two points is approximated by the sum of the arc lengths along the shortest path linking both points. Practically, the shortest path is computed by Dijkstra’s algorithm [4]. [page header: ESANN'2002 proceedings - European Symposium on Artificial Neural Networks, Bruges (Belgium), 24-26 April 2002, d-side publi., ISBN 2-930307-02-1, pp. 185-192] From a technical point of view, Isomap pr..."
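The geodesic approximation described in this context can be sketched in a few lines: sampled points become graph nodes, only neighbors are linked, edge weights are local arc lengths, and Dijkstra's algorithm sums them along the shortest path. The chain graph and weights below are made-up illustrative values, not data from the paper.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from `source` in an adjacency-dict graph."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale heap entry, skip
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Five points sampled along a curve, each linked only to its neighbors;
# weights stand in for local Euclidean arc lengths.
graph = {
    "a": {"b": 1.0},
    "b": {"a": 1.0, "c": 1.2},
    "c": {"b": 1.2, "d": 0.9},
    "d": {"c": 0.9, "e": 1.1},
    "e": {"d": 1.1},
}
dist = dijkstra(graph, "a")
print(dist["e"])   # geodesic a -> e: the edge weights summed along the chain
```

Because the graph only links neighbors, the returned distance follows the curve instead of cutting across the ambient space, which is the whole point of the approximation.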

**262** | Measuring the strangeness of strange attractors (Grassberger, Procaccia, 1983)

> Citation context: "...S encounters difficulties when projecting nonlinear structures like the spiral illustrated in Fig. 1a. Actually, the spiral is embedded in a two-dimensional space, but clearly its intrinsic dimension [5, 6] does not exceed one: only one parameter suffices to describe the spiral. Unfortunately, the projection from two dimensions to only one dimension is not easy because the spiral needs to be unrolled on..."
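The spiral argument can be checked numerically (toy values, not the paper's figure): one parameter t suffices to generate the curve, so its intrinsic dimension is 1, yet two points on adjacent whorls are close in Euclidean distance and far apart along the curve. That gap is what forces the unrolling.

```python
import math

def spiral(t):
    """Archimedean spiral r = t, returned as (x, y): one parameter only."""
    return (t * math.cos(t), t * math.sin(t))

# Two points one full turn apart: same angle, adjacent whorls.
t1, t2 = 2 * math.pi, 4 * math.pi
p1, p2 = spiral(t1), spiral(t2)

euclidean = math.hypot(p1[0] - p2[0], p1[1] - p2[1])

# Distance along the curve, approximated by summing small chords.
steps = 10000
arc = 0.0
prev = spiral(t1)
for i in range(1, steps + 1):
    cur = spiral(t1 + (t2 - t1) * i / steps)
    arc += math.hypot(cur[0] - prev[0], cur[1] - prev[1])
    prev = cur

print(euclidean)   # 2*pi, about 6.28: the whorls almost touch
print(arc)         # roughly an order of magnitude larger along the curve
```

A projection that preserves Euclidean distances would squeeze these two points together; one that preserves the curvilinear distance keeps them a full whorl apart, which is exactly the behaviour CDA and Isomap aim for.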

**152** | Curvilinear component analysis: A self-organizing neural network for nonlinear mapping of data sets (Demartines, Hérault, 1997)

> Citation context: "...dimensional space to the p-dimensional space is achieved algebraically by the traditional MDS, while CDA works with neural methods. Actually, CDA derives from the Curvilinear Component Analysis (CCA, [3, 7]) and from Sammon’s Nonlinear Mapping (NLM, [12]). Those two techniques act by preserving distances, like the MDS, but they proceed with an energy function which is minimized by gradient descent (NLM)..."

**127** | Competitive learning algorithms for vector quantization (Ahalt, Krishnamurthy, et al., 1990)

> Citation context: "...DA procedure: 1. apply vector quantization [1] on the raw data (this optional step yields prototypes which are very similar to the landmark points of Isomap); 2. compute the k- or ɛ-neighborhoods and link neighboring prototypes; 3. run Dijkstra’..."
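Step 1 of this procedure (the optional vector quantization) can be sketched with plain winner-take-all competitive learning, in the spirit of the cited reference. The data set, prototype count, and learning rate below are illustrative assumptions, not values from the paper.

```python
import math
import random

# Winner-take-all competitive learning: each training point pulls its
# nearest prototype toward it, so prototypes end up summarizing the cloud,
# much like Isomap's landmark points. Toy setup: two well-separated
# Gaussian clusters in the plane, one prototype per cluster.

random.seed(0)
data = [(random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(100)] + \
       [(random.gauss(5, 0.1), random.gauss(5, 0.1)) for _ in range(100)]

protos = [(0.5, 0.5), (4.5, 4.5)]   # initial prototype positions (assumed)
rate = 0.1                          # learning rate (assumed)

for epoch in range(20):
    random.shuffle(data)
    for x in data:
        # winner-take-all: only the closest prototype moves
        k = min(range(len(protos)),
                key=lambda i: math.dist(protos[i], x))
        px, py = protos[k]
        protos[k] = (px + rate * (x[0] - px), py + rate * (x[1] - py))

print(protos)   # prototypes settle near the two cluster centres
```

Steps 2 and 3 then operate on these prototypes instead of the raw data: neighboring prototypes are linked, and Dijkstra's algorithm on that graph yields the curvilinear distances.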

**124** | Principal Component Analysis (Jolliffe, 2002)

> Citation context: "...llow the user to better analyze or visualize complex data sets. These techniques may be distinguished into two classes. In the first one are linear methods like the Principal Component Analysis (PCA, [8]) or the original metric multidimensional scaling (MDS, [14]). In the second class are nonlinear algorithms like Kohonen’s Self-Organizing Map (SOM, [9, 10]) or nonlinear variants of the MDS. Contrari..."
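The variance-preservation criterion that separates PCA from the distance-preserving methods can be made concrete in a few lines. A minimal 2-D sketch with made-up points near a line (not data from the paper): projecting onto the leading eigenvector of the covariance matrix keeps almost all of the variance.

```python
import math

# PCA keeps the direction of maximal variance -- the linear criterion that
# the nonlinear methods in this paper replace by distance preservation.
# Made-up 2-D points lying near the line y = x.
data = [(-2.0, -1.9), (-1.0, -1.1), (0.0, 0.1), (1.0, 0.9), (2.0, 2.0)]
n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n

# Entries of the 2x2 covariance matrix.
sxx = sum((x - mx) ** 2 for x, _ in data) / n
syy = sum((y - my) ** 2 for _, y in data) / n
sxy = sum((x - mx) * (y - my) for x, y in data) / n

# Leading eigenvector of a 2x2 symmetric matrix, via its rotation angle.
theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
u = (math.cos(theta), math.sin(theta))

# 1-D projection: coordinates of each point along the principal axis.
proj = [(x - mx) * u[0] + (y - my) * u[1] for x, y in data]
var_total = sxx + syy
var_kept = sum(p * p for p in proj) / n
print(var_kept / var_total)   # close to 1: nearly all variance preserved
```

For the spiral of Fig. 1a this criterion fails: no single linear axis captures the structure, which is what motivates the nonlinear, distance-based alternatives.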

**97** | Graph approximations to geodesics on embedded manifolds, manuscript (Bernstein, Silva, et al., 2000)

> Citation context: "...near projection methods that derive more or less directly from the MDS. The first one is called Isomap [13] and is described in Section 2. Isomap differs from the linear MDS by the innovative metrics [2] used to measure the pairwise distances in the data. The second method is..."

**32** | A Nonlinear Mapping Algorithm for Data Structure Analysis (Sammon, 1969)

> Citation context: "...hieved algebraically by the traditional MDS, while CDA works with neural methods. Actually, CDA derives from the Curvilinear Component Analysis (CCA, [3, 7]) and from Sammon’s Nonlinear Mapping (NLM, [12]). Those two techniques act by preserving distances, like the MDS, but they proceed with an energy function which is minimized by gradient descent (NLM) or by stochastic gradient descent (CCA). Formal..."
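The energy-function idea shared by NLM and CCA can be sketched directly: define a stress that compares input-space distances d*ᵢⱼ with output-space distances dᵢⱼ, then lower it by gradient descent. The sketch below uses a finite-difference gradient for brevity and a made-up three-point data set; Sammon's original algorithm uses the analytic gradient, and CCA replaces the full gradient by stochastic updates with a neighborhood weighting.

```python
# Sammon-style stress: sum of (d*_ij - d_ij)^2 / d*_ij over all pairs,
# normalised by the sum of the input-space distances. Mapping a right
# triangle (sides 1, 2, sqrt(5)) onto a line cannot be perfect, but
# gradient descent still drives the stress down. Illustrative sketch only.

target = {(0, 1): 1.0, (1, 2): 2.0, (0, 2): 5 ** 0.5}

def stress(y):
    """Sammon stress of a 1-D configuration y."""
    c = sum(target.values())
    s = 0.0
    for (i, j), dstar in target.items():
        d = abs(y[i] - y[j])
        s += (dstar - d) ** 2 / dstar
    return s / c

y = [0.0, 0.5, 1.0]              # poor initial 1-D layout
h, rate = 1e-6, 0.5              # finite-difference step, learning rate
before = stress(y)
for _ in range(200):             # plain gradient descent on the stress
    grad = []
    for k in range(len(y)):
        yp = list(y)
        yp[k] += h
        grad.append((stress(yp) - stress(y)) / h)
    y = [yk - rate * g for yk, g in zip(y, grad)]
after = stress(y)
print(before, "->", after)       # stress drops as distances are matched
```

The residual stress reflects the triangle inequality: a 1-D line cannot reproduce all three distances at once, so the method settles for the best distance-preserving compromise.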

**23** | A robust nonlinear projection method (Lee, Lendasse, et al., 2000)

> Citation context: "...called Curvilinear Distances Analysis (CDA, [11], see Section 3) and shares the same metrics as Isomap. Both methods have been developed simultaneously and independently. The comparison (Section 4) highlights the theoretical and practical differenc..."

**13** | Self-organization of topologically correct feature maps (Kohonen, 1982)

> Citation context: "...ods like the Principal Component Analysis (PCA, [8]) or the original metric multidimensional scaling (MDS, [14]). In the second class are nonlinear algorithms like Kohonen’s Self-Organizing Map (SOM, [9, 10]) or nonlinear variants of the MDS. Contrarily to the linear PCA, the last ones do not use a criterion based on variance preservation. Instead, they try to reproduce in the projection space the pairwi..."

**10** | Curvilinear Component Analysis for high dimensional data representation: I. Theoretical aspects and practical use in the presence of noise (Hérault, Jausions-Picaud, et al., 1999)

> Citation context: "...dimensional space to the p-dimensional space is achieved algebraically by the traditional MDS, while CDA works with neural methods. Actually, CDA derives from the Curvilinear Component Analysis (CCA, [3, 7]) and from Sammon’s Nonlinear Mapping (NLM, [12]). Those two techniques act by preserving distances, like the MDS, but they proceed with an energy function which is minimized by gradient descent (NLM)..."

**6** | Intrinsic dimensionality extraction (Fukunaga, 1982)

> Citation context: "...S encounters difficulties when projecting nonlinear structures like the spiral illustrated in Fig. 1a. Actually, the spiral is embedded in a two-dimensional space, but clearly its intrinsic dimension [5, 6] does not exceed one: only one parameter suffices to describe the spiral. Unfortunately, the projection from two dimensions to only one dimension is not easy because the spiral needs to be unrolled on..."
Citation Context ...S encounters difficulties when projecting nonlinear structures like the spiral illustrated in Fig. 1a. Actually, the spiral is embedded in a two-dimensional space, but clearly its intrinsic dimension =-=[5, 6]-=- does not exceed one: only one parameter suffices to describe the spiral. Unfortunately, the projection from two dimensions to only one dimension is not easy because the spiral needs to be unrolled on... |