Results 1 - 10 of 119

Flexible Priors for Exemplar-based Clustering

by Daniel Tarlow, Richard S. Zemel, Brendan J. Frey
"... Exemplar-based clustering methods have been shown to produce state-of-the-art results on a number of synthetic and real-world clustering problems. They are appealing because they offer computational benefits over latent-mean models and can handle arbitrary pairwise similarity measures between data p ..."
Abstract - Cited by 10 (3 self) - Add to MetaCart

Recovery guarantees for exemplar-based clustering. arXiv preprint arXiv:1309.3256

by Abhinav Nellore, Rachel Ward, 2013
"... For a certain class of distributions, we prove that the linear programming relaxation of k-medoids clustering—a variant of k-means clustering where means are replaced by exemplars from within the dataset—distinguishes points drawn from nonoverlapping balls with high prob-ability once the number of p ..."
Abstract - Cited by 2 (0 self) - Add to MetaCart
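
For context, the relaxation in question is typically of the following form (a sketch of the standard k-medoids linear program; the paper's exact formulation and constants may differ). Here d(p,q) is the pairwise dissimilarity, z_{pq} is the fraction of point p assigned to candidate exemplar q, and k is the number of clusters:

    \begin{aligned}
    \min_{z \ge 0} \quad & \sum_{p,q} d(p,q)\, z_{pq} \\
    \text{s.t.} \quad & \textstyle\sum_{q} z_{pq} = 1 \quad \text{for every point } p, \\
    & z_{pq} \le z_{qq} \quad \text{for all } p, q \quad \text{(assign only to chosen exemplars)}, \\
    & \textstyle\sum_{q} z_{qq} = k \quad \text{(exactly } k \text{ exemplars)}.
    \end{aligned}

An integral optimum (every z_{pq} in {0, 1}) is an exact k-medoids solution; recovery guarantees of this kind identify conditions, such as well-separated balls, under which the relaxation is tight.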

Convex clustering with exemplar-based models

by Danial Lashkari, Polina Golland - In Advances in Neural Information Processing Systems (NIPS), 2007
"... Clustering is often formulated as the maximum likelihood estimation of a mixture model that explains the data. The EM algorithm widely used to solve the resulting optimization problem is inherently a gradient-descent method and is sensitive to initialization. The resulting solution is a local optimu ..."
Abstract - Cited by 37 (0 self) - Add to MetaCart
optimum in the neighborhood of the initial guess. This sensitivity to initialization presents a significant challenge in clustering large data sets into many clusters. In this paper, we present a different approach to approximate mixture fitting for clustering. We introduce an exemplar-based likelihood
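
As a rough sketch of what an exemplar-based likelihood looks like (my paraphrase of the general idea, not the paper's exact notation): every mixture component is pinned to a data point, only the mixture weights q are free, and the resulting log-likelihood is concave in q, so the optimum does not depend on initialization:

    \max_{q} \; \frac{1}{n} \sum_{i=1}^{n} \log \sum_{j=1}^{n} q_j \, f(x_i; x_j)
    \quad \text{s.t.} \quad q_j \ge 0, \;\; \sum_{j=1}^{n} q_j = 1,

where f(x_i; x_j) is a fixed component centered at exemplar x_j, for example exp(-\beta\, d(x_i, x_j)) for some similarity scale \beta. Data points whose weight q_j remains above zero at the optimum act as the cluster exemplars.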

Clustering by passing messages between data points

by Brendan J. Frey, Delbert Dueck - Science, 2007
"... Clustering data by identifying a subset of representative examples is important for processing sensory signals and detecting patterns in data. Such “exemplars ” can be found by randomly choosing an initial subset of data points and then iteratively refining it, but this works well only if that initi ..."
Abstract - Cited by 696 (8 self) - Add to MetaCart
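
The message-passing procedure referred to here is affinity propagation; below is a minimal NumPy sketch of its responsibility/availability updates (the damping factor, iteration count, and function name are my choices, not taken from the paper):

    import numpy as np

    def affinity_propagation(S, damping=0.9, iters=200):
        """Minimal sketch of affinity propagation on a similarity matrix S.
        S[i, k] is the similarity of point i to candidate exemplar k; the
        diagonal S[k, k] holds the 'preference' of k to serve as an exemplar."""
        n = S.shape[0]
        R = np.zeros((n, n))  # responsibilities r(i, k)
        A = np.zeros((n, n))  # availabilities a(i, k)
        rows = np.arange(n)
        for _ in range(iters):
            # r(i,k) <- s(i,k) - max_{k' != k} [ a(i,k') + s(i,k') ]
            AS = A + S
            top = AS.argmax(axis=1)
            first = AS[rows, top]
            AS[rows, top] = -np.inf
            second = AS.max(axis=1)
            R_new = S - first[:, None]
            R_new[rows, top] = S[rows, top] - second
            R = damping * R + (1 - damping) * R_new
            # a(i,k) <- min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
            # a(k,k) <- sum_{i' != k} max(0, r(i',k))
            Rp = np.maximum(R, 0)
            np.fill_diagonal(Rp, R.diagonal())
            A_new = Rp.sum(axis=0)[None, :] - Rp
            diag = A_new.diagonal().copy()
            A_new = np.minimum(A_new, 0)
            np.fill_diagonal(A_new, diag)
            A = damping * A + (1 - damping) * A_new
        # each point's exemplar is the k maximizing a(i,k) + r(i,k)
        return (A + R).argmax(axis=1)

A common default, suggested in the paper, is to use negative squared distances as similarities and to set every diagonal preference S[k, k] to the median of the input similarities, which tends to yield a moderate number of clusters.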

Exemplar-based Robust Coherent Biclustering

by Kewei Tu, Xixiu Ouyang, Dingyi Han, Yong Yu, Vasant Honavar
"... The biclustering, co-clustering, or subspace clustering problem involves simultaneously grouping the rows and columns of a data matrix to uncover biclusters or sub-matrices of the data matrix that optimize a desired objective function. In coherent biclustering, the objective function contains a cohe ..."
Abstract - Cited by 2 (0 self) - Add to MetaCart
. A distinguishing feature of these algorithms is that they identify an exemplar or a prototypical member of each bi-cluster. We note the interference from background elements in bi-clustering, and offer a means to circumvent such interference using additional regularization. Our experiments

A Decoupled Approach to Exemplar-based Unsupervised Learning

by Sebastian Nowozin, Gökhan Bakır
"... A recent trend in exemplar based unsupervised learning is to formulate the learning problem as a convex optimization problem. Convexity is achieved by restricting the set of possible prototypes to training exemplars. In particular, this has been done for clustering, vector quantization and mixture m ..."
Abstract - Cited by 7 (0 self)

Outlier Detection with Globally Optimal Exemplar-Based GMM

by Xingwei Yang, Longin Jan Latecki, Dragoljub Pokrajac
"... Outlier detection has recently become an important problem in many data mining applications. In this paper, a novel unsupervised algorithm for outlier detection is proposed. First we apply a provably globally optimal Expectation Maximization (EM) algorithm to fit a Gaussian Mixture Model (GMM) to a given data set. In our approach, a Gaussian is centered at each data point, and hence, the estimated mixture proportions can be interpreted as probabilities of being a cluster center for all data points. The outlier factor at each data point is then defined as a weighted sum of the mixture proportions ..."
Abstract - Cited by 5 (0 self)
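
A rough sketch of how this reads: estimate mixture proportions over Gaussians centered at every data point using the concave exemplar-based likelihood (see the formulation under the Lashkari and Golland entry above), then score each point by a weighted combination of the learned proportions. The bandwidth, the fixed-point update, and the k-nearest-neighbor weighting below are illustrative assumptions, not details taken from the paper:

    import numpy as np
    from scipy.spatial.distance import cdist

    def exemplar_gmm_outlier_factors(X, bandwidth=1.0, k=10, iters=500):
        """Sketch: mixture proportions over point-centered Gaussians, then an
        outlier factor per point as a weighted sum of those proportions."""
        n = X.shape[0]
        D2 = cdist(X, X, "sqeuclidean")
        # F[i, j] is proportional to a Gaussian centered at x_j evaluated at x_i
        F = np.exp(-D2 / (2.0 * bandwidth ** 2))

        # EM-style updates for the proportions pi; the objective
        # sum_i log sum_j pi_j F[i, j] is concave in pi, so these
        # iterations move toward its global maximizer.
        pi = np.full(n, 1.0 / n)
        for _ in range(iters):
            resp = F * pi                              # unnormalized responsibilities
            resp /= resp.sum(axis=1, keepdims=True)
            pi = resp.mean(axis=0)

        # Assumed weighting: average the proportions over each point's k nearest
        # neighbors; a small value means the neighborhood is unlikely to host a
        # cluster center, which we read as a high outlier factor.
        nn = np.argsort(D2, axis=1)[:, 1:k + 1]
        neighborhood_mass = pi[nn].mean(axis=1)
        return 1.0 - neighborhood_mass / neighborhood_mass.max()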

A Bayesian, exemplar-based approach to hierarchical shape matching

by Dariu M. Gavrila - IEEE Trans. Pattern Anal. Mach. Intell
"... Abstract—This paper presents a novel probabilistic approach to hierarchical, exemplar-based shape matching. No feature correspondence is needed among exemplars, just a suitable pairwise similarity measure. The approach uses a template tree to efficiently represent and match the variety of shape exem ..."
Abstract - Cited by 74 (8 self) - Add to MetaCart
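
To illustrate the template-tree idea mentioned in this snippet, here is a generic coarse-to-fine sketch under my own assumptions about the data structure, distance function, and thresholds; it is not the paper's actual algorithm:

    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class TemplateNode:
        prototype: object                 # a representative exemplar for this subtree
        threshold: float                  # max allowed distance before descending
        children: List["TemplateNode"] = field(default_factory=list)
        exemplars: List[object] = field(default_factory=list)  # leaf-level exemplars

    def match(root: TemplateNode, query, dist: Callable[[object, object], float]):
        """Coarse-to-fine matching: compare the query against a node's prototype
        and only descend into subtrees whose prototype is close enough."""
        best, best_d = None, float("inf")
        stack = [root]
        while stack:
            node = stack.pop()
            if dist(query, node.prototype) > node.threshold:
                continue                  # prune the whole subtree
            for ex in node.exemplars:
                d = dist(query, ex)
                if d < best_d:
                    best, best_d = ex, d
            stack.extend(node.children)
        return best, best_d

Pruning whole subtrees at a coarse prototype is what makes matching a query against a large exemplar set cheaper than exhaustive comparison.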

Pairwise Exemplar Clustering

by Yingzhen Yang, Xinqi Chu, Feng Liang, Thomas S. Huang
"... Exemplar-based clustering methods have been extensively shown to be effective in many clustering problems. They adaptively determine the number of clusters and hold the ap-pealing advantage of not requiring the estimation of latent pa-rameters, which is otherwise difficult in case of complicated par ..."
Abstract - Cited by 1 (0 self) - Add to MetaCart

Convex Clustering with Exemplar-Based Models

by unknown authors
"... Address email Clustering is often formulated as the maximum likelihood estimation of a mixture model that explains the data. The EM algorithm widely used to solve the resulting optimization problem is inherently a gradient-descent method and is sensitive to initialization. The resulting solution is ..."
Abstract - Add to MetaCart
is a local optimum in the neighborhood of the initial guess. This sensitivity to initialization presents a significant challenge in clustering large data sets into many clusters. In this paper, we present a different approach to approximate mixture fitting for clustering. We introduce an exemplar-based