## Improved approximation algorithms for large matrices via random projections

### Download From

IEEE

### Download Links

- [www.ilab.sztaki.hu]
- DBLP

### Other Repositories/Bibliography

Venue: Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2006)

Citations: 93 (3 self)

### BibTeX

@INPROCEEDINGS{Sarlós_improvedapproximation,
  author    = {Tamás Sarlós},
  title     = {Improved approximation algorithms for large matrices via random projections},
  booktitle = {Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science},
  year      = {2006},
  pages     = {143--152}
}

### Abstract

Recently, several results have appeared that show significant reductions in running time for matrix multiplication, singular value decomposition (SVD), and linear (ℓ2) regression, all based on data-dependent random sampling. Our key idea is that low-dimensional embeddings can be used to eliminate the data dependence and provide more versatile, linear-time, pass-efficient matrix computations. Our main contributions are summarized as follows.

- Independently of the recent results of Har-Peled and of Deshpande and Vempala, one of the first (and, to the best of our knowledge, the most efficient) relative-error (1 + ε)‖A − A_k‖_F approximation algorithms for the singular value decomposition of an m × n matrix A with M non-zero entries; it requires 2 passes over the data and runs in time O((Mk/ε + (n + m)k²/ε²) log(1/δ)).
- The first o(nd²)-time (1 + ε) relative-error approximation algorithm for n × d linear (ℓ2) regression.
- A matrix multiplication algorithm that easily applies to implicitly given matrices.
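To illustrate the sketch-and-solve idea behind the regression result, the following is a minimal sketch in NumPy: project a tall n × d least-squares problem down to a much smaller one with a random matrix, then solve the small problem. The paper obtains its o(nd²) running time with a fast structured transform; the dense Gaussian sketch, the dimensions, and the sketch size `s` below are simplifying assumptions for demonstration, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5000, 20

# Synthetic tall least-squares instance: b is a noisy linear function of A.
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# Exact solution of min_x ||Ax - b||_2 for comparison.
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)

# Sketch-and-solve: S compresses n rows down to s rows (s << n), and we
# solve the s x d problem min_x ||SAx - Sb||_2 instead.
s = 400  # illustrative sketch size; theory needs it to grow with d and 1/eps
S = rng.standard_normal((s, n)) / np.sqrt(s)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Ratio of the sketched solution's residual to the optimal residual;
# a (1 + eps)-approximation means this ratio is close to 1.
rel_err = np.linalg.norm(A @ x_sketch - b) / np.linalg.norm(A @ x_exact - b)
print(rel_err)
```

The point of the guarantee is that `rel_err` concentrates near 1 even though the solve touches only an s × d system, so the expensive O(nd²) exact solve is replaced by the cost of applying the sketch plus a much smaller regression.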