## Almost Optimal Unrestricted Fast Johnson-Lindenstrauss Transform (2003)

Venue: Noga Alon. Problems and results in extremal combinatorics – I. Discrete Mathematics

Citations: 31 (1 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Ailon03almostoptimal,
  author    = {Nir Ailon and Edo Liberty},
  title     = {Almost optimal unrestricted fast Johnson-Lindenstrauss transform},
  booktitle = {Noga Alon. Problems and results in extremal combinatorics -- I. Discrete Mathematics},
  year      = {2003}
}
```


### Abstract

The problems of random projections and sparse reconstruction have much in common and have individually received much attention. Surprisingly, until now they progressed in parallel and remained mostly separate. Here, we employ new tools from probability in Banach spaces that were successfully used in the context of sparse reconstruction to advance on an open problem in random projection. In particular, we generalize and use an intricate result by Rudelson and Vershynin for sparse reconstruction which uses Dudley's theorem for bounding Gaussian processes. Our main result states that any set of N = exp(Õ(n)) real vectors in n-dimensional space can be linearly mapped to a space of dimension k = O(log N polylog(n)), while (1) preserving the pairwise distances among the vectors to within any constant distortion and (2) being able to apply the transformation in time O(n log n) on each vector. This improves on the best known N = exp(Õ(n^(1/2))) achieved by Ailon and Liberty and N = exp(Õ(n^(1/3))) by Ailon and Chazelle. The dependence on the distortion constant, however, is believed to be suboptimal and subject to further investigation. For constant distortion, this settles the open question posed by these authors up to a polylog(n) factor while considerably simplifying their constructions.
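The O(n log n) application time in the abstract comes from using a fast orthogonal transform in place of a dense random matrix. As a rough illustration of the general recipe (a subsampled randomized Hadamard transform; this is not the paper's exact construction, and the helper names `fwht` and `srht` are ours):

```python
import numpy as np

def fwht(x):
    """Orthonormal fast Walsh-Hadamard transform in O(n log n); len(x) must be a power of 2."""
    x = x.astype(float).copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x / np.sqrt(n)  # scale so the transform preserves Euclidean norms

def srht(y, k, rng):
    """Map y in R^n to R^k: random signs, fast Hadamard mixing, then k sampled rows."""
    n = len(y)
    signs = rng.choice([-1.0, 1.0], size=n)        # random sign diagonal D
    z = fwht(signs * y)                            # H D y, computed in O(n log n)
    rows = rng.choice(n, size=k, replace=False)    # random row selection
    return np.sqrt(n / k) * z[rows]                # rescale so E‖Φy‖² = ‖y‖²
```

Because the mixing step HD spreads the mass of y roughly evenly across coordinates, sampling a small number k of rows preserves norms with high probability; the paper's contribution is pushing this kind of guarantee to sets of N = exp(Õ(n)) vectors.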

### Citations

1764 | Compressed sensing
- Donoho
- 2006
Citation Context: ...tions and sparse reconstruction is the preservation of metric information under a dimension reducing transformation. In sparse reconstruction theory, this property is known as restricted isometry [15][16]. A matrix Φ is a restricted isometry with sparseness parameter r if for some δ > 0, (1.1) ∀ r-sparse y ∈ Rⁿ: (1−δ)‖y‖₂² ≤ ‖Φy‖₂² ≤ (1+δ)‖y‖₂². By r-sparse y we mean vectors in Rⁿ with all but...
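The restricted isometry condition (1.1) quoted in the snippet above is easy to probe numerically. The sketch below is an illustration using a plain Gaussian matrix (a standard restricted isometry with high probability in compressed sensing, not a construction from this paper); it estimates the empirical distortion δ over random r-sparse vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, r = 256, 128, 10
# Rows scaled by 1/sqrt(k) so that E‖Φy‖² = ‖y‖² for any fixed y
Phi = rng.standard_normal((k, n)) / np.sqrt(k)

ratios = []
for _ in range(200):
    y = np.zeros(n)
    support = rng.choice(n, size=r, replace=False)  # random r-sparse support
    y[support] = rng.standard_normal(r)
    ratios.append(np.linalg.norm(Phi @ y) ** 2 / np.linalg.norm(y) ** 2)

# Smallest empirical δ such that (1-δ)‖y‖² ≤ ‖Φy‖² ≤ (1+δ)‖y‖² held on this sample
delta = max(max(ratios) - 1.0, 1.0 - min(ratios))
```

With k substantially larger than r log(n/r), the empirical δ stays well below 1, matching the qualitative behavior that equation (1.1) demands.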

1332 | Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Candès, Romberg, Tao
Citation Context: ...ojections and sparse reconstruction is the preservation of metric information under a dimension reducing transformation. In sparse reconstruction theory, this property is known as restricted isometry [15][16]. A matrix Φ is a restricted isometry with sparseness parameter r if for some δ > 0, (1.1) ∀ r-sparse y ∈ Rⁿ: (1−δ)‖y‖₂² ≤ ‖Φy‖₂² ≤ (1+δ)‖y‖₂². By r-sparse y we mean vectors in Rⁿ with all...

409 | Extensions of Lipschitz mappings into a Hilbert space
- Johnson, Lindenstrauss
- 1984
Citation Context: ...ndenstrauss's original construction. It became synonymous with the process of approximate metric preserving dimension reduction using randomized linear mappings. However, these linear ... a suitable size [1][2][3][4] result in optimal construction [5] in the parameters n (the original dimension), k (the target dimension), N (the number of input vectors) and δ (the distortion). However, these construction...

285 | The volume of convex bodies and Banach space geometry
- Pisier
- 1989
Citation Context: ... any two points z1, z2 ∈ B, ‖z1 − z2‖_X = max_{i≤k} |〈xi, z1 − z2〉| ≤ max_{i≤k} ‖xi‖_∞ ‖z1 − z2‖_1 ≤ 2, the last inequality from ‖Φ‖_∞ = 1 together with our above assertion that B ⊆ B1. A volumetric argument [19] is used to then conclude that N(B, ‖·‖_X, u) ≤ (1 + O(1/u))^n. Following Rudelson and Vershynin's final step in [13], we derive a bound for the integral ∫₀^∞ N^(1/2)(B, ‖·‖_X, u) du by balancing th...

211 | From sparse solutions of systems of equations to sparse modeling of signals and images
- Bruckstein, Donoho, et al.
- 2009
Citation Context: ...r an excellent survey explaining how restricted isometry can be used for sparse reconstruction, and why designing such matrices with good computational properties is important, we refer the readers to [17] and to references therein. Independently, Ailon and Chazelle [8] and Ailon and Liberty [10] were interested in constructing a distribution of k × n matrices Φ such that for any set Y ⊆ Rⁿ of cardina...

150 | Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- Achlioptas
- 2003
Citation Context: ...ss's original construction. It became synonymous with the process of approximate metric preserving dimension reduction using randomized linear mappings. However, these linear ... a suitable size [1][2][3][4] result in optimal construction [5] in the parameters n (the original dimension), k (the target dimension), N (the number of input vectors) and δ (the distortion). However, these constructions' result...

120 | On sparse reconstruction from Fourier and Gaussian measurements
- Rudelson, Vershynin
Citation Context: ...[10], a different notation was used. The number of vectors was n, the original dimension was d and the distortion parameter was ε. Here, we chose to follow the notation used by Rudelson and Vershynin [13], since our construction and analysis closely ... The transformation we derive here is a composition of two random matrices: a random sign matrix and a random selection of a suitable number k of rows from...

112 | An elementary proof of the Johnson-Lindenstrauss lemma, Random Struct
- Dasgupta, Gupta
- 2005
Citation Context: ...rauss's original construction. It became synonymous with the process of approximate metric preserving dimension reduction using randomized linear mappings. However, these linear ... a suitable size [1][2][3][4] result in optimal construction [5] in the parameters n (the original dimension), k (the target dimension), N (the number of input vectors) and δ (the distortion). However, these constructions' res...

102 | Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform
- Ailon, Chazelle
- 2006
Citation Context: ...projection and compressed sensing have much in common, they have mostly progressed in parallel. Here we combine recent work on bounds for sparse reconstruction to improve bounds of Ailon and Chazelle [8, 9] and Ailon and Liberty [10] on fast random projections, also known as fast Johnson-Lindenstrauss transformations. The new bounds allow obtaining the well known Fast Johnson-Lindenstrauss Transform for ...

98 | The Johnson-Lindenstrauss lemma and the sphericity of some graphs
- Frankl, Maehara
- 1988
Citation Context: ...nstrauss's original construction. It became synonymous with the process of approximate metric preserving dimension reduction using randomized linear mappings. However, these linear ... a suitable size [1][2][3][4] result in optimal construction [5] in the parameters n (the original dimension), k (the target dimension), N (the number of input vectors) and δ (the distortion). However, these constructions' ...

93 | Improved approximation algorithms for large matrices via random projections
- Sarlos
- 2006
Citation Context: ... dependence in the parameters n, k, N and δ. Applications for such transformations were found, e.g., in designing fast approximation algorithms for solving large scale linear algebraic operations (e.g. [6, 7]). Although random projection and compressed sensing have much in common, they have mostly progressed in parallel. Here we combine recent work on bounds for sparse reconstruction to improve bounds of ...

42 | Fast dimension reduction using Rademacher series on dual BCH codes
- Ailon, Liberty
Citation Context: ...sing have much in common, they have mostly progressed in parallel. Here we combine recent work on bounds for sparse reconstruction to improve bounds of Ailon and Chazelle [8, 9] and Ailon and Liberty [10] on fast random projections, also known as fast Johnson-Lindenstrauss transformations. The new bounds allow obtaining the well known Fast Johnson-Lindenstrauss Transform for finite sets of bounded card...

36 | On variants of the Johnson-Lindenstrauss lemma - Matousek - 2006

29 | A fast randomized algorithm for the approximation of matrices
- Woolfe, Liberty, et al.
Citation Context: ... dependence in the parameters n, k, N and δ. Applications for such transformations were found, e.g., in designing fast approximation algorithms for solving large scale linear algebraic operations (e.g. [6, 7]). Although random projection and compressed sensing have much in common, they have mostly progressed in parallel. Here we combine recent work on bounds for sparse reconstruction to improve bounds of ...

15 | A derandomized sparse Johnson-Lindenstrauss transform. arXiv preprint arXiv:1006.3585
- Kane, Nelson
- 2010
Citation Context: ...e that with probability at least 0.98, uniformly for all y ∈ Y, (1/k)‖ΦD_y b‖² = (1/k)‖ΦD_y̌ b‖² + (1/k)‖ΦD_ŷ b‖² + 2bᵗD_ŷΦᵗΦD_y̌ b = ‖y‖² + O(δ), as required. 4 A note on running time: In [11] and [20] the authors present random operators which try to minimize the application time for sparse vectors. This is an important line of research given the increasing popularity of random projections for onl...

9 | A Sparse Johnson-Lindenstrauss Transform
- Dasgupta, Kumar, et al.
- 2010
Citation Context: ...e conclude that with probability at least 0.98, uniformly for all y ∈ Y, (1/k)‖ΦD_y b‖² = (1/k)‖ΦD_y̌ b‖² + (1/k)‖ΦD_ŷ b‖² + 2bᵗD_ŷΦᵗΦD_y̌ b = ‖y‖² + O(δ), as required. 4 A note on running time: In [11] and [20] the authors present random operators which try to minimize the application time for sparse vectors. This is an important line of research given the increasing popularity of random projection...

8 | Problems and results in extremal combinatorics—I, Discrete Math
- Alon
- 2003
Citation Context: ...me synonymous with the process of approximate metric preserving dimension reduction using randomized linear mappings. However, these linear ... a suitable size [1][2][3][4] result in optimal construction [5] in the parameters n (the original dimension), k (the target dimension), N (the number of input vectors) and δ (the distortion). However, these constructions' resulting running time complexity, measur...

7 | Faster dimension reduction - Ailon, Chazelle - 2010

3 | Dense fast random projections and lean Walsh transforms
- Liberty, Ailon, Singer
- 2008
Citation Context: ...ta et al.'s work [11] on construction of Johnson-Lindenstrauss random matrices which can be more efficiently applied to sparse vectors, with applications in the streaming model, and Ailon et al.'s work [12] on design of Johnson-Lindenstrauss matrices that run in linear time under certain assumptions on various norms of the input vectors. 3 mappings need not be (and indeed are usually not) projections in...