## Polynomial Time Approximation Schemes for Geometric k-Clustering (2001)


### Download Links

- [www.argreenhouse.com]
- [www.cs.technion.ac.il]
- DBLP

### Other Repositories/Bibliography

Venue: J. of the ACM

Citations: 30 (5 self)

### BibTeX

```bibtex
@ARTICLE{Ostrovsky01polynomialtime,
  author    = {Rafail Ostrovsky and Yuval Rabani},
  title     = {Polynomial Time Approximation Schemes for Geometric k-Clustering},
  journal   = {J. of the ACM},
  year      = {2001},
  pages     = {349--358},
  publisher = {ACM}
}
```


### Abstract

The Johnson-Lindenstrauss lemma states that n points in a high-dimensional Hilbert space can be embedded, with small distortion of the distances, into an O(log n)-dimensional space by applying a random linear transformation. We show that similar (though weaker) properties hold for certain random linear transformations over the Hamming cube. We use these transformations to solve NP-hard clustering problems in the cube as well as in geometric settings. More specifically, we address the following clustering problem. Given n points in a larger set (for example, R^d) endowed with a distance function (for example, L² distance), we would like to partition the data set into k disjoint clusters, each with a "cluster center", so as to minimize the sum over all data points of the distance between the point and the center of the cluster containing the point. The problem is provably NP-hard in some high-dimensional geometric settings, even for k = 2. We give polynomial time approximation schemes for this problem in several settings, including the binary cube {0, 1}^d with Hamming distance, and R^d with L¹ distance, with L² distance, or with the square of the L² distance. In all these settings, the best previous results were constant factor approximation guarantees. We note that our problem is similar in flavor to the k-median problem (and the related facility location problem), which has been considered in graph-theoretic and fixed-dimensional geometric settings, where it becomes hard when k is part of the input. In contrast, we study the problem when k is fixed, but the dimension is part of the input.
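The random linear transformation the abstract opens with can be illustrated with a small sketch. This is not code from the paper; it shows one standard Johnson-Lindenstrauss construction (a scaled Gaussian projection), with all dimensions and names chosen for illustration, and checks empirically that pairwise L² distances are roughly preserved after projecting to a much lower dimension.

```python
import math
import random

random.seed(0)

def random_projection(points, k):
    """Map points from their ambient dimension d down to k dimensions
    using a random Gaussian matrix scaled by 1/sqrt(k) — one classic
    Johnson-Lindenstrauss construction (illustrative, not the paper's)."""
    d = len(points[0])
    # Each row of R is an independent random Gaussian direction.
    R = [[random.gauss(0.0, 1.0) for _ in range(d)] for _ in range(k)]
    scale = 1.0 / math.sqrt(k)
    return [
        [scale * sum(r[j] * p[j] for j in range(d)) for r in R]
        for p in points
    ]

def dist(p, q):
    """Euclidean (L²) distance between two points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# n points in a high-dimensional space; the JL lemma says a target
# dimension k = O(log n / eps^2) suffices to preserve all pairwise
# distances up to a (1 +/- eps) factor. The sizes here are arbitrary.
n, d, k = 20, 1000, 200
points = [[random.gauss(0.0, 1.0) for _ in range(d)] for _ in range(n)]
projected = random_projection(points, k)

# Ratio of projected to original distance for every pair of points;
# with high probability all ratios are close to 1.
ratios = [
    dist(projected[i], projected[j]) / dist(points[i], points[j])
    for i in range(n) for j in range(i + 1, n)
]
print(min(ratios), max(ratios))
```

With these parameters the observed ratios stay close to 1 for every pair, which is the "small distortion" property the abstract invokes; the paper's contribution is showing that analogous (weaker) transformations exist over the Hamming cube and suffice for clustering.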