Results 1–10 of 764
A PTAS for the minimum consensus clustering problem with a fixed number of clusters
In Proc. 11th ICTCS, 2009
"... The Consensus Clustering problem has been introduced as an effective way to analyze the results of different microarray experiments [5, 6]. The problem consists of looking for a partition that best summarizes a set of input partitions (each corresponding to a different microarray experiment) under a ..."
Cited by 3 (0 self)
Laplacian eigenmaps and spectral techniques for embedding and clustering
In Proceedings of Neural Information Processing Systems, 2001
"... Drawing on the correspondence between the graph Laplacian, the Laplace-Beltrami operator on a manifold, and the connections to the heat equation, we propose a geometrically motivated algorithm for constructing a representation for data sampled from a low-dimensional manifold embedded in a higher-dimensional space. The algorithm provides a computationally efficient approach to nonlinear dimensionality reduction that has locality-preserving properties and a natural connection to clustering. Several applications are considered. In many areas of artificial intelligence, information ..."
Cited by 668 (7 self)
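The embedding this abstract describes can be sketched with standard numerical tools: build a k-nearest-neighbour graph with heat-kernel weights, form the graph Laplacian L = D − W, and solve the generalized eigenproblem L y = λ D y. The choices below (k = 5, heat parameter t = 1, a two-dimensional target) are illustrative, not the paper's fixed parameters:

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmap(X, k=5, t=1.0, dim=2):
    """Sketch of a Laplacian-eigenmaps embedding: k-NN graph with
    heat-kernel weights, then the generalized eigenproblem L y = lam D y."""
    n = len(X)
    # pairwise squared distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # symmetric k-nearest-neighbour adjacency with heat-kernel weights
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d2[i])[1:k + 1]:   # skip i itself
            w = np.exp(-d2[i, j] / t)
            W[i, j] = W[j, i] = w
    D = np.diag(W.sum(1))
    L = D - W                                  # graph Laplacian
    _, vecs = eigh(L, D)                       # generalized eigenproblem
    return vecs[:, 1:dim + 1]                  # drop the constant eigenvector

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
Y = laplacian_eigenmap(X)
print(Y.shape)  # (40, 2)
```

The bottom nonzero eigenvectors give the embedding coordinates; locality preservation comes from the heat-kernel weights penalizing the separation of nearby points.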
Aggregating inconsistent information: ranking and clustering
2005
"... We address optimization problems in which we are given contradictory pieces of input information and the goal is to find a globally consistent solution that minimizes the number of disagreements with the respective inputs. Specifically, the problems we address are rank aggregation, the feedback arc set problem on tournaments, and correlation and consensus clustering. We show that for all these problems (and various weighted versions of them), we can obtain improved approximation factors using essentially the same remarkably simple algorithm. Additionally, we almost settle a long ..."
Cited by 226 (17 self)
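For the correlation clustering case, the "remarkably simple algorithm" the abstract alludes to is a random-pivot procedure (often referred to as KwikCluster): pick a random pivot, put it in a cluster with all remaining "+" neighbours, and repeat on what is left. A minimal sketch, with the input encoding (a node list plus a list of "+" edges) assumed here for illustration:

```python
import random

def kwik_cluster(nodes, plus_edges, seed=0):
    """Random-pivot correlation clustering: pick a pivot, cluster it
    with its remaining '+' neighbours, remove them, repeat."""
    rng = random.Random(seed)
    plus = {frozenset(e) for e in plus_edges}   # symmetric '+' labels
    remaining = list(nodes)
    clusters = []
    while remaining:
        pivot = rng.choice(remaining)
        cluster = {pivot} | {v for v in remaining
                             if v != pivot and frozenset((pivot, v)) in plus}
        clusters.append(cluster)
        remaining = [v for v in remaining if v not in cluster]
    return clusters

# two '+'-cliques joined only by '-' edges split cleanly
edges = [(1, 2), (2, 3), (1, 3), (4, 5)]
print(kwik_cluster([1, 2, 3, 4, 5], edges))
```

On complete "+/−" graphs this pivot scheme is known to give a constant-factor approximation in expectation; the improved factors claimed in the abstract come from the paper's refinements (e.g. non-uniform pivot choice), which are not shown in this sketch.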
Incremental Clustering and Dynamic Information Retrieval
1997
"... Motivated by applications such as document and image classification in information retrieval, we consider the problem of clustering dynamic point sets in a metric space. We propose a model called incremental clustering which is based on a careful analysis of the requirements of the information retrieval ... incremental clustering algorithms which have a provably good performance. We complement our positive results with lower bounds on the performance of incremental algorithms. Finally, we consider the dual clustering problem where the clusters are of fixed diameter, and the goal is to minimize the number ..."
Cited by 191 (4 self)
Correlation clustering with a fixed number of clusters
 Theory of Computing
"... We continue the investigation of problems concerning correlation clustering or clustering with qualitative information, which is a clustering formulation that has been studied recently (Bansal, Blum, Chawla (2004), Charikar, Guruswami, Wirth (FOCS ’03), Charikar, Wirth (FOCS ’04), Alon et al. ..."
Cited by 38 (0 self)
Consensus clustering + meta clustering = multiple consensus clustering
Florida Artificial Intelligence Research Society Conference, 2011
"... Consensus clustering and meta clustering are two important extensions of the classical clustering problem. Given a set of input clusterings of a given dataset, consensus clustering aims to find a single final clustering which is a better fit in some sense than the existing clusterings, and meta clustering ..."
Cited by 3 (0 self)
Clustering with qualitative information
In Proceedings of the 44th Annual IEEE Symposium on Foundations of Computer Science, 2003
"... We consider the problem of clustering a collection of elements based on pairwise judgments of similarity and dissimilarity. Bansal, Blum and Chawla [1] cast the problem thus: given a graph G whose edges are labeled “+” (similar) or “−” (dissimilar), partition the vertices into clusters so that ..."
Cited by 122 (9 self)
A Polynomial Time Approximation Scheme for k-Consensus Clustering
2010
"... This paper introduces a polynomial time approximation scheme for the metric Correlation Clustering problem, when the number of clusters returned is bounded (by k). Consensus Clustering is a fundamental aggregation problem, with considerable application, and it is analysed here as a metric variant of the Correlation Clustering problem. The PTAS exploits a connection between Correlation Clustering and the k-cut problems. This requires the introduction of a new rebalancing technique, based on minimum cost perfect matchings, to provide clusters of the required sizes. Both Consensus Clustering and Correlation ..."
Cited by 2 (0 self)
Clustering ensembles: Models of consensus and weak partitions
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005
"... Clustering ensembles have emerged as a powerful method for improving both the robustness as well as the stability of unsupervised classification solutions. However, finding a consensus clustering from multiple partitions is a difficult problem that can be approached from graph-based, combinatorial ..."
Cited by 85 (3 self)
Parallel Preconditioning with Sparse Approximate Inverses
SIAM J. Sci. Comput., 1996
"... A parallel preconditioner is presented for the solution of general sparse linear systems of equations. A sparse approximate inverse is computed explicitly, and then applied as a preconditioner to an iterative method. The computation of the preconditioner is inherently parallel, and its application only requires a matrix-vector product. The sparsity pattern of the approximate inverse is not imposed a priori but captured automatically. This keeps the amount of work and the number of nonzero entries in the preconditioner to a minimum. Rigorous bounds on the clustering of the eigenvalues ..."
Cited by 226 (10 self)
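The core construction in this abstract can be sketched column by column: fix a sparsity pattern for each column m_j of the approximate inverse M and solve the small least-squares problem min ||A m_j − e_j||₂ over that pattern. A minimal dense-arithmetic sketch with a hand-chosen tridiagonal pattern (the actual method uses sparse data structures and captures the pattern adaptively, which is not shown here):

```python
import numpy as np

def spai(A, pattern):
    """Least-squares sparse approximate inverse: column j of M
    minimizes ||A m_j - e_j||_2 with nonzeros restricted to pattern[j]."""
    n = A.shape[0]
    M = np.zeros((n, n))
    for j in range(n):
        J = pattern[j]                    # allowed nonzero rows of column j
        e = np.zeros(n)
        e[j] = 1.0
        m, *_ = np.linalg.lstsq(A[:, J], e, rcond=None)
        M[J, j] = m
    return M

# tridiagonal test matrix, with a matching tridiagonal pattern for M
n = 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
pattern = [[i for i in (j - 1, j, j + 1) if 0 <= i < n] for j in range(n)]
M = spai(A, pattern)
# each column's residual is minimized, so A @ M is closer to I than A is
print(np.linalg.norm(A @ M - np.eye(n)) <= np.linalg.norm(A - np.eye(n)))  # True
```

Because every column solves an independent least-squares problem, the columns can be computed in parallel, which is the property the abstract highlights.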