Results 1 - 10 of 1,111,852
Distance Metric Learning, with Application to Clustering with Side-Information
Advances in Neural Information Processing Systems 15, 2003
Cited by 799 (14 self)
"... Many algorithms rely critically on being given a good metric over their inputs. For instance, data can often be clustered in many "plausible" ways, and if a clustering algorithm such as K-means initially fails to find one that is meaningful to a user, the only recourse may be for the user ... In this paper, we present an algorithm that, given examples of similar (and, if desired, dissimilar) pairs of points in R^n, learns a distance metric over R^n that respects these relationships. Our method is based on posing metric learning as a convex optimization problem, which allows us ..."
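The metrics learned in this line of work are Mahalanobis metrics, d_M(x, y) = sqrt((x - y)^T M (x - y)) for a positive semidefinite matrix M. As a minimal NumPy sketch (illustrative only, not the paper's convex optimizer), note that any M = L^T L is automatically PSD, so the metric equals the Euclidean distance after mapping points by L:

```python
import numpy as np

def mahalanobis(x, y, M):
    """Distance d_M(x, y) = sqrt((x - y)^T M (x - y)) for a PSD matrix M."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

# Parameterizing M = L^T L (as many metric-learning methods do) keeps M PSD
# for any real matrix L.
rng = np.random.default_rng(0)
L = rng.standard_normal((2, 2))
M = L.T @ L

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

# Mahalanobis distance under M equals Euclidean distance after mapping by L.
assert np.isclose(mahalanobis(x, y, M), np.linalg.norm(L @ (x - y)))
```

With M set to the identity matrix, d_M reduces to the ordinary Euclidean distance.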
Distance Metric Learning with Kernels
Proceedings of the International Conference on Artificial Neural Networks, 2003
Cited by 38 (1 self)
"... In this paper, we propose a feature weighting method that works in both the input space and the kernel-induced feature space. It assumes only the availability of similarity (dissimilarity) information, and the number of parameters in the transformation does not depend on the number of features. Besides feature weighting, it can also be regarded as performing nonparametric kernel adaptation. Experimental results on both toy and real-world datasets show promising results."
Distance metric learning for large margin nearest neighbor classification
In NIPS, 2006
Cited by 685 (15 self)
"... We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k-nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven ..."
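The large-margin goal described in this abstract can be written as a hinge penalty per triplet: for a point x_i, a same-class target neighbor x_j, and a differently labeled "impostor" x_l, the learned map L is penalized unless x_l is at least a margin farther from x_i than x_j is. A hedged sketch of that single loss term (names illustrative; the actual method optimizes over all triplets via semidefinite programming):

```python
import numpy as np

def sq_dist(L, a, b):
    """Squared Mahalanobis distance under M = L^T L, i.e. ||L(a - b)||^2."""
    v = L @ (a - b)
    return float(v @ v)

def triplet_hinge(L, xi, xj, xl, margin=1.0):
    """Large-margin term: nonzero when impostor xl is not at least `margin`
    (in squared distance) farther from xi than target neighbor xj."""
    return max(0.0, margin + sq_dist(L, xi, xj) - sq_dist(L, xi, xl))

xi = np.array([0.0, 0.0])
xj = np.array([0.1, 0.0])   # same-class target neighbor
xl = np.array([5.0, 0.0])   # different-class point, already far away

# With the identity metric the impostor is well outside the margin: zero loss.
assert triplet_hinge(np.eye(2), xi, xj, xl) == 0.0
```

Summing this term over all (i, j, l) triplets, plus a pull term on target-neighbor distances, gives the convex objective the margin intuition above refers to.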
Bayesian active distance metric learning
UAI, 2007
Cited by 15 (2 self)
"... Distance metric learning is an important component for many tasks, such as statistical classification and content-based image retrieval. Existing approaches for learning distance metrics from pairwise constraints typically suffer from two major problems. First, most algorithms only offer point estim ..."
Hamming Distance Metric Learning
Cited by 31 (3 self)
"... Motivated by large-scale multimedia applications we propose to learn mappings from high-dimensional data to binary codes that preserve semantic similarity. Binary codes are well suited to large-scale applications as they are storage efficient and permit exact sub-linear kNN search. The framework is ..."
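Binary codes are storage-efficient precisely because the Hamming distance between two codes reduces to a bitwise XOR followed by a population count. A minimal NumPy sketch of that computation on bit-packed codes (names are illustrative, not from the paper):

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two binary codes stored as uint8-packed arrays:
    XOR the bytes, then count the set bits."""
    return int(np.unpackbits(np.bitwise_xor(a, b)).sum())

# Two 8-bit codes, packed one bit per position into uint8 bytes.
a = np.packbits([1, 0, 1, 1, 0, 0, 0, 1])
b = np.packbits([1, 1, 1, 0, 0, 0, 0, 1])

# The codes differ in bit positions 1 and 3.
assert hamming(a, b) == 2
```

Because the distance is computed bytewise, the same function handles codes of any length, which is what makes exhaustive or sub-linear kNN search over millions of codes practical.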
Adaptive Distance Metric Learning for Clustering
"... A good distance metric is crucial for unsupervised learning from high-dimensional data. To learn a metric without any constraint or class label information, most unsupervised metric learning algorithms appeal to projecting observed data onto a low-dimensional manifold, where geometric relationships ..."
Accuracy of Distance Metric Learning Algorithms
"... In this paper, we wanted to compare distance metric-learning algorithms on UCI datasets. We wanted to assess the accuracy of these algorithms in many situations, perhaps some that they were not initially designed for. We looked for many algorithms and chose four of them based on our criteria. We al ..."
Multi-Modal Distance Metric Learning
Cited by 3 (0 self)
"... Multi-modal data is dramatically increasing with the fast growth of social media. Learning a good distance measure for data with multiple modalities is of vital importance for many applications, including retrieval, clustering, classification and recommendation. In this paper, we propose an effective and scalable multi-modal distance metric learning framework. Based on the multi-wing harmonium model, our method provides a principled way to embed data of arbitrary modalities into a single latent space, of which an optimal distance metric can be learned under proper supervision, i.e., by minimizing ..."
Bayesian Multitask Distance Metric Learning
"... We present a Bayesian approach for jointly learning distance metrics for a large collection of potentially related learning tasks. We assume there exists a relatively smaller set of basis distance metrics and the distance metric for each task is a sparse, positively weighted combination of these ba ..."