## Generalized Non-metric Multidimensional Scaling

### Download Links

- [cosmal.ucsd.edu]
- [grail.cs.washington.edu]
- [people.kyb.tuebingen.mpg.de]
- [www.cs.washington.edu]
- [vision.ucsd.edu]
- [jmlr.csail.mit.edu]
- [www.cs.ucsd.edu]
- [homes.cs.washington.edu]
- [jmlr.org]
- [www.lcayton.com]
- [cseweb.ucsd.edu]

Citations: 16 (7 self)

### BibTeX

```bibtex
@MISC{Agarwal_generalizednon-metric,
  author = {Sameer Agarwal and Gert Lanckriet},
  title  = {Generalized Non-metric Multidimensional Scaling},
  year   = {}
}
```

### Abstract

We consider the non-metric multidimensional scaling problem: given a set of dissimilarities ∆, find an embedding whose inter-point Euclidean distances have the same ordering as ∆. In this paper, we look at a generalization of this problem in which only a set of order relations of the form d_ij < d_kl are provided. Unlike the original problem, these order relations can be contradictory and need not be specified for all pairs of dissimilarities. We argue that this setting is more natural in some experimental contexts, and propose an algorithm based on convex optimization techniques to solve the problem. We apply this algorithm to human subject data from a psychophysics experiment concerning how reflectance properties are perceived. We also look at the standard NMDS problem, where a dissimilarity matrix ∆ is provided as input, and show that we can always find an order-respecting embedding of ∆.
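As a toy illustration of the problem statement (ours, not from the paper), the sketch below checks whether a candidate embedding respects a set of order relations of the form d(i, j) < d(k, l):

```python
import numpy as np

def respects_order(X, quadruples):
    """Check whether embedding X (n x d) satisfies every relation
    d(i, j) < d(k, l) in the list of (i, j, k, l) quadruples."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return all(D[i, j] < D[k, l] for i, j, k, l in quadruples)

# Three collinear points: d(0,1) = 1 < d(0,2) = 2.
X = np.array([[0.0], [1.0], [2.0]])
print(respects_order(X, [(0, 1, 0, 2)]))   # True
print(respects_order(X, [(0, 2, 0, 1)]))   # False
```

Note that, as the abstract says, the relations may be contradictory, in which case no embedding can satisfy them all; that is what motivates the slack-based convex program.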

### Citations

830 | Using SeDuMi 1.02, a MATLAB toolbox for optimization over symmetric cones
- Sturm
- 1999

Citation Context: ...k_kk − 2k_kl + k_ll − k_ii + 2k_ij − k_jj ≥ 1 − ξ_ijkl, Σ_ab k_ab = 0, K ⪰ 0. (GNMDS) This program can be solved using a general-purpose semidefinite programming package. We obtained the best results using SeDuMi [14]. Unfortunately, general-purpose semidefinite programming solvers scale poorly with the problem size. To handle larger problem sizes, we implemented a first-order alternating projections algorithm [3, ...

540 | Distance metric learning with applications to clustering with side information
- Xing, Ng, et al.
- 2003

Citation Context: ...omparisons In this section we present a novel algorithm for learning a low rank embedding from a collection of paired comparisons. Our method is related to the recent work on distance metric learning [11, 15, 18, 20]. Let S be a set of 4-tuples (i, j, k, l). We hope to find an embedding X such that ‖x_i − x_j‖₂² ≤ ‖x_k − x_l‖₂² ∀ (i, j, k, l) ∈ S. (7) The set S can have repetitions and inconsistencies. We now der...

484 | Zippered polygon meshes from range images
- Turk, Levoy
- 1994

Citation Context: ...) Screen capture from the distance comparison test. (b) Six of the 55 images used in our psychophysics study. shown a series of triplets of rendered images. Each image consisted of the Stanford Bunny [16] rendered under constant illumination and viewing direction. The only difference was the material/BRDF used to describe how the surface of the bunny reflects light. For each triplet the subjects were ...

440 | Modern Multidimensional Scaling: Theory and Applications
- Borg, Groenen
- 2005

Citation Context: ...δ_kl. (1) This problem was first considered by Shepard [12, 13], but it was Kruskal who posed the problem as an optimization problem and introduced an alternating minimization procedure for solving it [8, 7, 2, 1]. We review the Shepard-Kruskal algorithm in the next section. A curiosity of the Shepard-Kruskal formulation of non-metric MDS is that it actually requires magnitudes as input, even though NMDS c...

386 | Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis
- Kruskal
- 1964

Citation Context: ...δ_kl. (1) This problem was first considered by Shepard [12, 13], but it was Kruskal who posed the problem as an optimization problem and introduced an alternating minimization procedure for solving it [8, 7, 2, 1]. We review the Shepard-Kruskal algorithm in the next section. A curiosity of the Shepard-Kruskal formulation of non-metric MDS is that it actually requires magnitudes as input, even though NMDS c...

358 | Distance metric learning for large margin nearest neighbor classification
- Weinberger, Blitzer, et al.
- 2005

Citation Context: ...omparisons In this section we present a novel algorithm for learning a low rank embedding from a collection of paired comparisons. Our method is related to the recent work on distance metric learning [11, 15, 18, 20]. Let S be a set of 4-tuples (i, j, k, l). We hope to find an embedding X such that ‖x_i − x_j‖₂² ≤ ‖x_k − x_l‖₂² ∀ (i, j, k, l) ∈ S. (7) The set S can have repetitions and inconsistencies. We now der...

353 | Rank Correlation Methods
- Kendall
- 1948

Citation Context: ...depending on the order in which the stimuli were presented. Thus, the actual numbers that the users give are typically not reliable; however, their relative ordering will be fairly consistent [6]. Because of the inconsistency of user ratings, non-metric MDS is more appropriate in this setting than metric MDS. Let us formalize the standard non-metric MDS problem. Problem 1 (Shepard-Kruskal Sca...

273 | The Analysis of Proximities: Multidimensional Scaling with an Unknown Distance Function
- Shepard
- 1962

Citation Context: ...l Scaling). Given a symmetric zero-diagonal matrix ∆ = [δ_ij], find X = [x_i] ∈ R^{d×n} such that ∀ i, j, k, l: ‖x_i − x_j‖₂² < ‖x_k − x_l‖₂² ⟺ δ_ij < δ_kl. (1) This problem was first considered by Shepard [12, 13], but it was Kruskal who posed the problem as an optimization problem and introduced an alternating minimization procedure for solving it [8, 7, 2, 1]. We review the Shepard-Kruskal algorithm in the n...

258 | Nonmetric multidimensional scaling: a numerical method. Psychometrika
- Kruskal
- 1964

Citation Context: ...δ_kl. (1) This problem was first considered by Shepard [12, 13], but it was Kruskal who posed the problem as an optimization problem and introduced an alternating minimization procedure for solving it [8, 7, 2, 1]. We review the Shepard-Kruskal algorithm in the next section. A curiosity of the Shepard-Kruskal formulation of non-metric MDS is that it actually requires magnitudes as input, even though NMDS c...

236 | Least-squares estimation of transformation parameters between two point patterns
- Umeyama
- 1991

Citation Context: ...t embedding. For each embedding, we aligned the 54 points with the corresponding points in the 55-point embedding via a similarity transformation. We then calculated the average squared distortion [17]. Note that paired comparisons are invariant to similarity transformations. To establish a scale for these errors, the average distance between pairs of points in the global embedding was calculated. ...
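The alignment step quoted here has a closed-form solution in the spirit of Umeyama's least-squares similarity estimate. A sketch (our implementation and names, not the paper's code):

```python
import numpy as np

def align_similarity(A, B):
    """Least-squares similarity transform (scale, rotation, translation)
    mapping the rows of A onto B (both n x d), following Umeyama [17]."""
    muA, muB = A.mean(0), B.mean(0)
    A0, B0 = A - muA, B - muB
    U, s, Vt = np.linalg.svd(B0.T @ A0)
    S = np.eye(A.shape[1])
    S[-1, -1] = np.sign(np.linalg.det(U @ Vt))  # exclude reflections
    R = U @ S @ Vt
    c = np.trace(np.diag(s) @ S) / (A0 ** 2).sum()
    return c * A0 @ R.T + muB  # aligned copy of A

def avg_sq_distortion(A, B):
    """Average squared distance between B and the best similarity-
    aligned copy of A; invariant to similarity transforms of A."""
    return ((align_similarity(A, B) - B) ** 2).sum(1).mean()
```

If B is an exact scaled, rotated, translated copy of A, the distortion is zero, which matches the invariance of paired comparisons noted in the context.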

232 | Geometrical Considerations and Nomenclature for Reflectance
- Nicodemus, Richmond, et al.
- 1977

Citation Context: ...rectional Reflectance Distribution Function (BRDF) is a mathematical description of how a surface reflects light. For every incident direction it describes the angular distribution of reflected light [10]. The shape of these distributions determines whether a material appears rough and matte or shiny and metallic. To understand how humans perceive the reflection of light from different kinds of surface...

150 | A data-driven reflectance model
- Matusik, Pfister, et al.
- 2003

Citation Context: ...on the right (Figure 2(a) shows a screen-shot from one such test). The use of triplets in this manner is a special case of Problem 2, where j = l. A total of 55 BRDFs from the MIT/MERL BRDF database [9] were used. In this study we restrict our attention to the achromatic aspects of the BRDF, also known as gloss (examples of some of these BRDFs appear in Figure 2(b)). While monochromatic, they have w...

133 | Learning a distance metric from relative comparisons
- Schultz, Joachims
- 2004

Citation Context: ...omparisons In this section we present a novel algorithm for learning a low rank embedding from a collection of paired comparisons. Our method is related to the recent work on distance metric learning [11, 15, 18, 20]. Let S be a set of 4-tuples (i, j, k, l). We hope to find an embedding X such that ‖x_i − x_j‖₂² ≤ ‖x_k − x_l‖₂² ∀ (i, j, k, l) ∈ S. (7) The set S can have repetitions and inconsistencies. We now der...

71 | An algorithm for restricted least squares regression
- Dykstra
- 1983

Citation Context: ...14]. Unfortunately, general-purpose semidefinite programming solvers scale poorly with the problem size. To handle larger problem sizes, we implemented a first-order alternating projections algorithm [3, 5]. Though this method converges much more slowly than interior-point based methods for moderately sized problems, it scales to large problem sizes and has minimal memory requirements. 4 Experiments 4.1...
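As a concrete illustration of the first-order alternating projections idea cited here, the sketch below runs Dykstra's scheme [3] on a standard toy problem, projecting onto the nearest correlation matrix (the paper projects onto its own constraint sets, not these; the problem choice is ours):

```python
import numpy as np

def proj_psd(K):
    """Project a symmetric matrix onto the PSD cone (clip eigenvalues)."""
    w, V = np.linalg.eigh((K + K.T) / 2)
    return (V * np.clip(w, 0, None)) @ V.T

def proj_unit_diag(K):
    """Project onto the affine set of matrices with unit diagonal."""
    K = K.copy()
    np.fill_diagonal(K, 1.0)
    return K

def dykstra_nearest_correlation(A, iters=200):
    """Dykstra's alternating projections between the PSD cone and the
    unit-diagonal set; each iteration costs one eigendecomposition,
    so memory stays minimal compared with interior-point solvers."""
    X, dS = A.copy(), np.zeros_like(A)
    for _ in range(iters):
        R = X - dS          # correction only for the non-affine set
        Y = proj_psd(R)
        dS = Y - R
        X = proj_unit_diag(Y)
    return X
```

The trade-off matches the quoted context: per-iteration cost is low and memory use minimal, but many iterations are needed compared with an interior-point method.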

39 | A successive projection method
- Han
- 1988

Citation Context: ...14]. Unfortunately, general-purpose semidefinite programming solvers scale poorly with the problem size. To handle larger problem sizes, we implemented a first-order alternating projections algorithm [3, 5]. Though this method converges much more slowly than interior-point based methods for moderately sized problems, it scales to large problem sizes and has minimal memory requirements. 4 Experiments 4.1...

31 | Rank minimization and applications in system theory
- Fazel, Hindi, et al.
- 2004
Citation Context: ...xity of our model. Unfortunately, the rank of a matrix is a non-convex function, and minimizing the rank of a symmetric positive semidefinite matrix subject to linear inequality constraints is NP-hard [4]. We thus relax the rank function to its convex envelope, the trace. This relaxation is standard in the convex programming literature. Using this relaxation has the additional benefit of constraining ...

30 | Distance metric learning with kernels
- Tsang, Kwok, et al.

1 | Toward a perceptual space for reflectance (in review)
- Wills, Agarwal, et al.

Citation Context: ...ring the embedding produced by running the GNMDS algorithm on the data from the psychophysics experiment described in the previous section (for additional details about the experiment we refer the reader to [19]). Figure 3(a) shows the 2-D embedding with cropped windows of the BRDF images displayed in the locations of the BRDFs in the new space. Notice the clustering of the BRDFs into two distinct clu...

1 | Toward a perceptual space for reflectance
- Anonymous

Citation Context: ...mages corresponding to them in each clump. There are also two pronounced trends in the embedding: a vertical trend with the darker BRDFs at the top gradually getting brighter, with the brightest BRDFs at the bottom (for additional details about the experiment we refer the reader to [1]). Horizontally, the embedding can be roughly broken up into two clusters: the primarily diffuse BRDFs and those that ...