## Locally linear metric adaptation with application to semi-supervised clustering and image retrieval (2005)

### Download Links

- [www.cs.ust.hk]
- [www.cse.ust.hk]
- [www.jdl.ac.cn]
- DBLP

### Other Repositories/Bibliography

Citations: 4 (0 self)

### BibTeX

@MISC{Chang05locallylinear,
  author = {Hong Chang and Dit-Yan Yeung},
  title  = {Locally linear metric adaptation with application to semi-supervised clustering and image retrieval},
  year   = {2005}
}

### Citations

1716 | Nonlinear Dimensionality Reduction by Locally Linear Embedding
- Roweis, Saul
- 2000
Citation Context ...o achieve this goal. One possibility is to preserve the locally linear relationships between nearest neighbors, as in a nonlinear dimensionality reduction method called locally linear embedding (LLE) [17]. Specifically, we seek to find the best reconstruction weights for all data points, represented as an n × n weight matrix W = [w_ij], by minimizing the cost function $E = \sum_i \big\| x_i - \sum_{j \in N_i} w_{ij} x_j \big\|^2$ ...
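The reconstruction-weight step quoted in this context can be sketched directly. This is a minimal illustration of the LLE weight solve, not the paper's implementation; the function name, the brute-force neighbor search, and the regularization constant are assumptions for the sketch:

```python
import numpy as np

def lle_weights(X, k=2, reg=1e-3):
    """Solve for weights w_ij minimizing sum_i ||x_i - sum_{j in N_i} w_ij x_j||^2
    subject to sum_j w_ij = 1 for each i (the standard LLE constraint)."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        # k nearest neighbors of x_i (excluding x_i itself)
        d = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]
        Z = X[nbrs] - X[i]                   # neighbors centered on x_i
        G = Z @ Z.T                          # local Gram matrix
        G += reg * np.trace(G) * np.eye(k)   # regularize for numerical stability
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs] = w / w.sum()             # enforce the sum-to-one constraint
    return W
```

For points on a line, each interior point is reconstructed exactly as the average of its two neighbors, so the weight rows sum to one and the reconstruction error vanishes there.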

1237 | Content-based image retrieval at the end of the early years
- Smeulders, Worring, et al.
Citation Context ...ed popularity of the World Wide Web (WWW) over the past decade, retrieval of images based on content, often referred to as content-based image retrieval (CBIR), has gained a lot of research interest [22]. The two determining factors for CBIR performance are the features used to represent the images and the distance function used to measure the similarity between a query image and the images in the da...

695 | Networks for Approximation and Learning - Poggio, Girosi - 1990 |

540 | Relevance feedback: a power tool for interactive content-based image retrieval
- Rui, Huang, et al.
- 1998
Citation Context ...raditional information retrieval community to improve the performance of information retrieval systems based on user feedback. This interactive approach has also become popular in CBIR [18]. The user is given the option of labeling (some of) the previously retrieved images as either relevant or irrelevant. Based on this feedback information, the CBIR system can iteratively refin...

540 | Distance metric learning with applications to clustering with side information
- Xing, Ng, et al.
- 2003
Citation Context ...pairwise constraints, so that the pairwise constraints can also influence the neighboring data points. However, neither method incorporates metric learning into the clustering algorithm. [26] proposed using pairwise side information in a novel way to learn a global Mahalanobis metric before performing clustering with constraints. Both Klein et al.’s and Xing et al.’s ...
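For reference, the global Mahalanobis metric at the heart of [26] is the distance $d_A(x, y) = \sqrt{(x - y)^T A (x - y)}$ for a positive semi-definite matrix $A$; learning amounts to choosing $A$ from the pairwise constraints. A minimal sketch of the distance itself (the function name and the toy vectors are assumptions, not from the paper):

```python
import numpy as np

def mahalanobis(x, y, A):
    """d_A(x, y) = sqrt((x - y)^T A (x - y)) for a PSD matrix A."""
    d = x - y
    return float(np.sqrt(d @ A @ d))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
euclid = mahalanobis(x, y, np.eye(2))              # A = I recovers the Euclidean distance
weighted = mahalanobis(x, y, np.diag([4.0, 1.0]))  # a diagonal A reweights the feature axes
```

With $A = I$ the distance is $\sqrt{2}$; with the diagonal $A$ above the first axis is stretched and the distance grows to $\sqrt{5}$.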

489 | Objective criteria for evaluation of clustering methods
- Rand
- 1971
Citation Context ...ns without metric learning; 3. k-means with RCA for metric learning; 4. k-means with LLMA for metric learning (gradient method); 5. k-means with LLMA for metric learning (spectral method). The Rand index [16] is used to measure the clustering quality in our experiments. It reflects the agreement of the clustering result with the ground truth. Let n_s be the number of point pairs that are assigned to the sa...
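The Rand index described in this snippet can be computed directly from its definition: the fraction of point pairs on which two labelings agree, i.e. pairs placed in the same cluster by both or in different clusters by both. A minimal sketch (the function name is an assumption):

```python
from itertools import combinations

def rand_index(labels_true, labels_pred):
    """Rand index: fraction of point pairs on which two clusterings agree
    (together in both, or apart in both)."""
    pairs = list(combinations(range(len(labels_true)), 2))
    agree = 0
    for i, j in pairs:
        same_true = labels_true[i] == labels_true[j]
        same_pred = labels_pred[i] == labels_pred[j]
        if same_true == same_pred:  # the pair is treated consistently
            agree += 1
    return agree / len(pairs)
```

Note the index is invariant to relabeling of clusters: two identical partitions score 1.0 even if their cluster IDs are permuted.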

350 | Constrained k-means clustering with background knowledge
- Wagstaff, Cardie, et al.
- 2001
Citation Context ...similarity or dissimilarity constraints. This type of supervisory information is weaker than the first type, in that pairwise constraints can be derived from labeled data but not vice versa. [23] and [24] proposed using such pairwise constraints to improve clustering results. [12] introduced spatial generalizations to pairwise constraints, so that the pairwise constraints can also have influence on th...

258 | Discriminant adaptive nearest neighbor classification and regression
- Hastie, Tibshirani
- 1996
Citation Context ...metric [7] were also developed for nearest neighbor classification. More recent research along this line continued to develop various locally adaptive metrics for nearest neighbor classifiers, e.g., [6, 14, 8, 5]. Besides nearest neighbor classifiers, there are other methods that also perform metric learning based on nearest neighbors, e.g., radial basis function networks and variants. While class label infor...

191 | Comparing images using color coherence vectors
- Pass, Zabih, et al.
- 1996
Citation Context ...ange of variations from the smallest class with 24 images to the largest class with 125 images. We first represent the images in the HSV color space, and then compute the color coherence vector (CCV) [15] as the feature vector for each image. Specifically, we quantize each image to 8 × 8 × 8 color bins, and then represent the image as a 1024-dimensional CCV $(\alpha_1, \beta_1, \ldots, \alpha_{512}, \beta_{512})^T$, with $\alpha_i$ and...
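The CCV construction described in this context (pixels of a color bin count as "coherent" only if they lie in a sufficiently large connected region) can be sketched as follows. The flood-fill, the 4-connectivity, the coherence threshold `tau`, and the function name are assumptions for illustration; the paper quantizes HSV images to 8 × 8 × 8 bins before this step:

```python
import numpy as np
from collections import deque

def ccv(img_bins, tau=4):
    """Color coherence vector: for each color bin, split its pixel count into
    alpha (pixels in connected regions of size >= tau, 'coherent') and beta
    (pixels in smaller regions). img_bins is a 2-D array of quantized
    color-bin indices."""
    h, w = img_bins.shape
    n_bins = int(img_bins.max()) + 1
    alpha = np.zeros(n_bins, dtype=int)
    beta = np.zeros(n_bins, dtype=int)
    seen = np.zeros((h, w), dtype=bool)
    for r in range(h):
        for c in range(w):
            if seen[r, c]:
                continue
            # flood-fill the 4-connected region sharing this bin index
            b, queue, size = img_bins[r, c], deque([(r, c)]), 0
            seen[r, c] = True
            while queue:
                y, x = queue.popleft()
                size += 1
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny, nx] \
                            and img_bins[ny, nx] == b:
                        seen[ny, nx] = True
                        queue.append((ny, nx))
            (alpha if size >= tau else beta)[b] += size
    # interleave as (alpha_1, beta_1, alpha_2, beta_2, ...)
    return np.stack([alpha, beta], axis=1).ravel()
```

With 512 bins this yields exactly the 1024-dimensional interleaved vector described above.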

164 | From instance-level constraints to space-level constraints: Making the most of prior knowledge in data clustering
- Klein, Kamvar, et al.
- 2002
Citation Context ...is weaker than the first type, in that pairwise constraints can be derived from labeled data but not vice versa. [23] and [24] proposed using such pairwise constraints to improve clustering results. [12] introduced spatial generalizations to pairwise constraints, so that the pairwise constraints can also have influence on the neighboring data points. However, neither method incorporates metric le...

161 | Intelligent Clustering with Instance-Level Constraints
- Wagstaff
Citation Context ...pairwise similarity or dissimilarity constraints. This type of supervisory information is weaker than the first type, in that pairwise constraints can be derived from labeled data but not vice versa. [23] and [24] proposed using such pairwise constraints to improve clustering results. [12] introduced spatial generalizations to pairwise constraints, so that the pairwise constraints can also have influe...

159 | Semi-supervised clustering by seeding - Basu, Banerjee, et al. - 2002 |

139 | Learning distance functions using equivalence relations
- Bar-Hillel, Hertz, et al.
- 2003
Citation Context ...s semi-supervised classification with the aid of unlabeled data. ...methods generally outperform Wagstaff et al.’s method in the experiments reported. Instead of using an iterative algorithm as in [26], [1] devised a more efficient, non-iterative algorithm called relevant component analysis (RCA) for learning a global Mahalanobis metric. However, their method can only incorporate similarity constraints....
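RCA, as summarized in this context, learns a global Mahalanobis metric non-iteratively by whitening the data with the average within-chunklet covariance, so that directions of within-class variation are down-weighted. A minimal sketch under that description (the function name, the eigendecomposition route, and the eigenvalue floor are assumptions):

```python
import numpy as np

def rca_transform(chunklets):
    """Relevant Component Analysis (sketch): whiten by the average
    within-chunklet covariance. chunklets is a list of (m_k, d) arrays of
    points known, via similarity constraints, to share a class."""
    d = chunklets[0].shape[1]
    n = sum(len(c) for c in chunklets)
    C = np.zeros((d, d))
    for c in chunklets:
        centered = c - c.mean(axis=0)     # center each chunklet on its own mean
        C += centered.T @ centered
    C /= n                                # average within-chunklet covariance
    # Mahalanobis metric A = C^{-1}; equivalently transform x -> C^{-1/2} x
    evals, evecs = np.linalg.eigh(C)
    W = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    return W
```

The returned matrix satisfies the whitening property $W C W = I$, so Euclidean distance in the transformed space equals the Mahalanobis distance under $A = C^{-1}$.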

123 | Flexible metric nearest neighbor classification. http://playfair.stanford.edu/reports/friedman
- Friedman
- 1994
Citation Context ...metric [7] were also developed for nearest neighbor classification. More recent research along this line continued to develop various locally adaptive metrics for nearest neighbor classifiers, e.g., [6, 14, 8, 5]. Besides nearest neighbor classifiers, there are other methods that also perform metric learning based on nearest neighbors, e.g., radial basis function networks and variants. While class label infor...

114 | Similarity metric learning for a variable-kernel classifier
- Lowe
- 1995
Citation Context ...metric [7] were also developed for nearest neighbor classification. More recent research along this line continued to develop various locally adaptive metrics for nearest neighbor classifiers, e.g., [6, 14, 8, 5]. Besides nearest neighbor classifiers, there are other methods that also perform metric learning based on nearest neighbors, e.g., radial basis function networks and variants. While class label infor...

83 | Computing Gaussian mixture models with EM using equivalence constraints
- Shental, Bar-Hillel, et al.
- 2003
Citation Context ...devised a more efficient, non-iterative algorithm called relevant component analysis (RCA) for learning a global Mahalanobis metric. However, their method can only incorporate similarity constraints. [19] extended the work of [1] by incorporating both pairwise similarity and dissimilarity constraints into the expectation-maximization (EM) algorithm for model-based clustering based on Gaussian mixture ...

80 | Clustering based on conditional distribution in an auxiliary space
- Sinkkonen, Kaski
Citation Context ...ssification which solves the classification problem with the aid of additional unlabeled data. One type of supervisory information is in the form of limited labeled data. Based on such information, [21] proposed a local metric learning method to improve clustering and visualization results. [2] explored using labeled data to generate initial seed clusters for the k-means clustering algorithm. Also, ...

78 | Locally adaptive metric nearest neighbour classification
- Domeniconi, Peng, et al.
- 2002

71 | The optimal distance measure for nearest neighbor classification
- Short, Fukunaga
- 1981
- 1981
Citation Context ...work on optimizing the metric for k-nearest neighbor density estimation. Later, optimal local metric [20] and optimal global metric [7] were also developed for nearest neighbor classification. More recent research along this line continued to develop various locally adaptive metrics for nearest neighbor ...

55 | Learning Distance Functions for Image Retrieval
- Hertz, Bar-Hillel, et al.
- 2004
Citation Context ...challenging new direction has aroused great interest in the research community. In particular, RCA [1, 11] has been used to improve image retrieval performance in CBIR tasks. More recently, DistBoost [9, 10], a nonmetric distance learning method that makes use of the pairwise constraints and performs boosting, also demonstrated very good image retrieval results in CBIR tasks. In this section, we will app...

55 | Learning with idealized kernels
- Kwok, Tsang
- 2003
Citation Context ...d the work of [1] by incorporating both pairwise similarity and dissimilarity constraints into the expectation-maximization (EM) algorithm for model-based clustering based on Gaussian mixture models. [13] established the relationship between metric learning and kernel matrix adaptation. To summarize, we can categorize metric learning methods according to two different dimensions. The first dimension i...

40 | Boosting margin based distance function for clustering
- Hertz, Bar-Hillel, et al.
- 2004
Citation Context ...challenging new direction has aroused great interest in the research community. In particular, RCA [1, 11] has been used to improve image retrieval performance in CBIR tasks. More recently, DistBoost [9, 10], a nonmetric distance learning method that makes use of the pairwise constraints and performs boosting, also demonstrated very good image retrieval results in CBIR tasks. In this section, we will app...

31 | Enhancing image and video retrieval: Learning via equivalence constraints
- Hertz, Shental, et al.
- 2003
Citation Context ...more promising approach is to learn a good distance function from data automatically. Recently, this challenging new direction has aroused great interest in the research community. In particular, RCA [1, 11] has been used to improve image retrieval performance in CBIR tasks. More recently, DistBoost [9, 10], a nonmetric distance learning method that makes use of the pairwise constraints and performs boos...

25 | Optimization of k-nearest-neighbor density estimates - Fukunaga, Hostetler - 1973 |

24 | Locally Linear Metric Adaptation for Semi-supervised Clustering
- Chang, Yeung
- 2004
Citation Context ...h the penalty term P playing the role of an internal energy term. The optimization problem formulated above can be solved in an iterative manner, resulting in an iterative metric adaptation procedure [3]. In [3], we decrease the Gaussian window parameters ω and σ, which determine the neighborhood size and the weights in the penalty term respectively, over time. In this way, the local specificity is incre...

24 | Multidimensional scaling by iterative majorization using radial basis functions
- Webb
- 1995
Citation Context ...ion criterion is $\sum_i \sum_j \alpha_{ij}\,(q_{ij}(L) - p_{ij})^2$, where $\alpha_{ij} = s_{ij} + \lambda N_\sigma(d_{ij})$ and $p_{ij} = \dfrac{\lambda N_\sigma(d_{ij})}{s_{ij} + \lambda N_\sigma(d_{ij})}\, d_{ij}$. Since this form is the same as that for multidimensional scaling for discriminant analysis [25], we can solve the optimization problem by iterative majorization, which can be seen as an EM-like algorithm for problems with no missing data. We define $C = \sum_i \sum_j \alpha_{ij}(\pi_i - \pi_j)(\pi_i - \pi_j)^T$ and D...

23 | An optimal global nearest neighbor metric
- Fukunaga, Flick
- 1984
Citation Context ...work on optimizing the metric for k-nearest neighbor density estimation. Later, optimal local metric [20] and optimal global metric [7] were also developed for nearest neighbor classification. More recent research along this line continued to develop various locally adaptive metrics for nearest neighbor classifiers, e.g., [6, 14, 8, ...

22 | Parametric distance metric learning with label information
- Zhang, Kwok, et al.
- 2003
Citation Context ...proposed a local metric learning method to improve clustering and visualization results. [2] explored using labeled data to generate initial seed clusters for the k-means clustering algorithm. Also, [27] proposed a parametric distance metric learning method for both classification and clustering tasks. Another type of supervisory information is in the form of pairwise similarity or dissimilarity cons...

15 | Adaptive kernel metric nearest neighbor classification - Peng, Heisterkamp, et al. - 2002 |

8 | On deformable models for visual pattern recognition
- Cheung, Yeung, et al.
- 2002
Citation Context ...objective function for the optimization problem. Note that the optimization criterion in (3) is analogous to objective functions commonly used in energy minimization models such as deformable models [4], with the penalty term P playing the role of an internal energy term. The optimization problem formulated above can be solved in an iterative manner, resulting in an iterative metric adaptation proce...