Results 1 – 9 of 9
On the Impossibility of Dimension Reduction in ℓ1
In Proc. 35th Annu. ACM Sympos. Theory Comput., 2003
Abstract

Cited by 43 (1 self)
The Johnson-Lindenstrauss Lemma shows that any n points in Euclidean space (with distances measured by the ℓ2 norm) may be mapped down to O((log n)/ε²) dimensions such that no pairwise distance is distorted by more than a (1 + ε) factor. Determining whether such dimension reduction is possible in ℓ1 has been an intriguing open question. Charikar and Sahai [7] recently showed lower bounds for dimension reduction in ℓ1 that can be achieved by linear projections, and positive results for shortest-path metrics of restricted graph families. However, the question of general dimension reduction in ℓ1 was still open. For example, it was not known whether it is possible to reduce the number of dimensions to O(log n) with 1 + ε distortion. We show strong lower bounds for general dimension reduction in ℓ1. We give an explicit family of n points in ℓ1 such that any embedding with distortion δ requires n^Ω(1/δ²) dimensions. This proves that there is no analog of the Johnson-Lindenstrauss Lemma for ℓ1.
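The ℓ2 dimension reduction that this abstract contrasts with can be made concrete with a small sketch. The snippet below is an illustrative implementation of the standard random-projection technique behind the Johnson-Lindenstrauss Lemma, not the construction of this paper; the function name `jl_project` and the constant 4 in the target dimension are choices of this sketch, not taken from the source.

```python
import numpy as np

def jl_project(points, eps, rng=None):
    """Randomly project n points in R^d down to k = O((log n)/eps^2)
    dimensions; with high probability every pairwise l2 distance is
    preserved to within a (1 +/- eps) factor."""
    rng = rng or np.random.default_rng(0)
    n, d = points.shape
    k = int(np.ceil(4 * np.log(n) / eps ** 2))  # target dimension (constant 4 is one common choice)
    # i.i.d. Gaussian entries with variance 1/k make the map norm-preserving in expectation
    proj = rng.normal(0.0, np.sqrt(1.0 / k), size=(d, k))
    return points @ proj

pts = np.random.default_rng(1).normal(size=(50, 1000))  # 50 points in R^1000
low = jl_project(pts, eps=0.5)                          # 50 points in R^63
```

The lower bound in this paper says precisely that no such distribution-independent trick can exist for the ℓ1 norm.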
On Approximate Nearest Neighbors in Non-Euclidean Spaces
In FOCS, 1998
Abstract

Cited by 33 (8 self)
The nearest neighbor search (NNS) problem is the following: Given a set of n points P = {p_1, …, p_n} in some metric space X, preprocess P so as to efficiently answer queries which require finding a point in P closest to a query point q ∈ X. Approximate nearest neighbor search (c-NNS) is a relaxation of NNS which allows returning any point within c times the distance to the nearest neighbor (called a c-nearest neighbor). This problem is of major and growing importance to a variety of applications. In this paper, we give a (4⌈log_{1+ρ} log 4d⌉ + 3)-NNS algorithm in ℓ∞^d with O(dn^{1+ρ} log n) storage and O(d log n) query time. In particular, this yields the first algorithm for O(1)-NNS for ℓ∞ with subexponential storage. The preprocessing time is close to linear in the size of the data structure. The algorithm can also be used (after simple modifications) to output the exact nearest neighbor in time bounded by O(d log n) plus the number of (4⌈log_{1+ρ} log 4d...
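The c-NNS definition can be illustrated with a toy sketch (this is only the problem statement made concrete, not the paper's data structure; the helper names and the choice of the ℓ∞ metric as a stand-in for the generic space X are assumptions of this sketch):

```python
def dist(p, q):
    # l_infinity distance, standing in for the abstract's metric space X
    return max(abs(a - b) for a, b in zip(p, q))

def nearest_neighbor(P, q):
    # exact NNS by brute-force linear scan over P
    return min(P, key=lambda p: dist(p, q))

def is_c_nearest(P, q, candidate, c):
    # candidate is a valid c-nearest neighbor iff it lies within
    # c times the distance from q to the true nearest neighbor
    return dist(candidate, q) <= c * dist(nearest_neighbor(P, q), q)

P = [(0, 0), (5, 1), (3, 3)]
q = (1, 1)
```

Here the true nearest neighbor of q is (0, 0) at distance 1, so (3, 3), at distance 2, is an acceptable answer for c = 2 but not for c = 1.5. The point of the paper's data structure is to answer such queries far faster than this linear scan.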
On embedding trees into uniformly convex Banach spaces
1999
Abstract

Cited by 29 (2 self)
We investigate the minimum value of D = D(n) such that any n-point tree metric space (T, ρ) can be D-embedded into a given Banach space (X, ‖·‖), that is, there exists a mapping f : T → X with (1/D) ρ(x, y) ≤ ‖f(x) − f(y)‖ ≤ ρ(x, y) for any x, y ∈ T. Bourgain showed that D(n) grows to infinity for any superreflexive X (and this characterizes superreflexivity), and for X = ℓ_p, 1 < p < ∞, he proved a quantitative lower bound of const · (log log n)^min(1/2, 1/p). We give another, completely elementary proof of this lower bound, and we prove that it is tight (up to the value of the constant). In particular, we show that any n-point tree metric space can be D-embedded into a Euclidean space, with no restriction on the dimension, with D = O(√(log log n)).
1 Introduction
Let M be a metric space with metric ρ, let X be a normed space with norm ‖·‖, and let f : M → X be a mapping. We say that f is a D-embedding, D ≥ 1 a real number, if we have (1/D) ρ(x, y) ≤ ‖f(x) − ...
On the impossibility of dimension reduction in ℓ1
In Proceedings of the 44th Annual IEEE Conference on Foundations of Computer Science, 2003
Abstract

Cited by 28 (1 self)
The Johnson-Lindenstrauss Lemma shows that any n points in Euclidean space (with distances measured by the ℓ2 norm) may be mapped down to O((log n)/ε²) dimensions such that no pairwise distance is distorted by more than a (1 + ε) factor. Determining whether such dimension reduction is possible in ℓ1 has been an intriguing open question. We show strong lower bounds for general dimension reduction in ℓ1. We give an explicit family of n points in ℓ1 such that any embedding with distortion δ requires n^Ω(1/δ²) dimensions. This proves that there is no analog of the Johnson-Lindenstrauss Lemma for ℓ1; in fact, embedding with any constant distortion requires n^Ω(1) dimensions. Further, embedding the points into ℓ1 with 1 + ε distortion requires n^(1...
Embedding tree metrics into low dimensional Euclidean spaces
Discrete Comput. Geom., 2000
Abstract

Cited by 28 (0 self)
We consider the question of embedding metrics induced by trees into Euclidean spaces with a restricted number of dimensions. We show that any weighted tree T with n(T) vertices and l(T) leaves can be embedded into d-dimensional Euclidean space with Õ(l(T)^(1/(d−1))) distortion. Further, we exhibit an embedding with almost the same distortion which can be computed efficiently. This distortion substantially improves the previous best upper bound of Õ(n(T)^(2/d)) and almost matches the best known lower bound of Ω(l(T)^(1/d)).
1 Introduction
The study of finite metric spaces has been the focus of much attention in recent years due both to its mathematical and to its computational significance. Perhaps the most widely studied class of questions has been that of the embeddability of a finite metric space into some target space with the aim of preserving the distances between points in the metric. Often the distances cannot be preserved exactly, and then we aim to k...
A Sketch-Based Distance Oracle for Web-Scale Graphs
Abstract

Cited by 16 (1 self)
We study the fundamental problem of computing distances between nodes in large graphs such as the web graph and social networks. Our objective is to be able to answer distance queries between pairs of nodes in real time. Since the standard shortest-path algorithms are expensive, our approach moves the time-consuming shortest-path computation offline, and at query time only looks up precomputed values and performs simple and fast computations on them. More specifically, during the offline phase we compute and store a small “sketch” for each node in the graph, and at query time we look up the sketches of the source and destination nodes and perform a simple computation using these two sketches to estimate the distance.
Categories and Subject Descriptors: G.2.2 [Graph Theory]: Graph algorithms, path and circuit problems
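The offline/online split described above can be sketched with a simplified landmark-based scheme (an illustration only, not the paper's exact sketch construction): each node stores its BFS distance to a few random "seed" nodes, and at query time the triangle inequality gives an upper estimate from the two stored sketches alone.

```python
import random
from collections import deque

def bfs_distances(adj, source):
    # unweighted single-source shortest paths by breadth-first search
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def build_sketches(adj, num_seeds=2, seed=0):
    # offline phase: each node's sketch is its distance to every seed node
    seeds = random.Random(seed).sample(sorted(adj), num_seeds)
    tables = [bfs_distances(adj, s) for s in seeds]
    return {u: [t.get(u, float("inf")) for t in tables] for u in adj}

def estimate_distance(sketches, u, v):
    # query phase: d(u, v) <= min over seeds s of d(u, s) + d(s, v)
    return min(a + b for a, b in zip(sketches[u], sketches[v]))

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # path graph 0-1-2-3
sketches = build_sketches(adj)
est = estimate_distance(sketches, 0, 3)
```

On this path graph every seed lies on the 0-to-3 shortest path, so the estimate is exact (3); in general the estimate is an upper bound whose quality depends on how the seeds are chosen.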
Ramsey partitions and proximity data structures
 J. European Math. Soc 9
Abstract

Cited by 13 (2 self)
This paper addresses two problems lying at the intersection of geometric analysis and theoretical computer science: the nonlinear isomorphic Dvoretzky theorem and the design of good approximate distance oracles for large distortion. We introduce the notion of Ramsey partitions of a finite metric space, and show that the existence of good Ramsey partitions implies a solution to the metric Ramsey problem for large distortion (a.k.a. the nonlinear version of the isomorphic Dvoretzky theorem, as introduced by Bourgain, Figiel, and Milman in [8]). We then proceed to construct optimal Ramsey partitions, and use them to show that for every ε ∈ (0, 1), any n-point metric space has a subset of size n^(1−ε) which embeds into Hilbert space with distortion O(1/ε). This result is best possible and improves part of the metric Ramsey theorem of Bartal, Linial, Mendel and Naor [5], in addition to considerably simplifying its proof. We use our new Ramsey partitions to design the best known approximate distance oracles when the distortion is large, closing a gap left open by Thorup and Zwick in [31]. Namely, we show that for any n-point metric space X and k ≥ 1, there exists an O(k)-approximate distance oracle whose storage requirement is O(n^(1+1/k)) and whose query time is a universal constant. We also discuss applications of Ramsey partitions to various other geometric data structure problems, such as the design of efficient data structures for approximate ranking.
Dimension Reduction in the ℓ1 Norm
In the 43rd Annual Symposium on Foundations of Computer Science (FOCS'02), 2002
Abstract

Cited by 6 (1 self)
The Johnson-Lindenstrauss Lemma shows that any set of n points in Euclidean space can be mapped linearly down to O((log n)/ε²) dimensions such that all pairwise distances are distorted by at most 1 + ε. We study the following basic question: Does there exist an analogue of the Johnson-Lindenstrauss Lemma for the ℓ1 norm? Note that the Johnson-Lindenstrauss Lemma gives a linear embedding which is independent of the point set. For the ℓ1 norm, we show that one cannot hope to use linear embeddings as a dimensionality reduction tool for general point sets, even if the linear embedding is chosen as a function of the given point set. In particular, we construct a set of n points in ℓ1 such that any linear embedding into ℓ1^d must incur a distortion of Ω(√(n/d)). This bound is tight up to a log n factor. We then initiate a systematic study of general classes of ℓ1-embeddable metrics that admit low-dimensional, small-distortion embeddings. In particular, we show dimensionality reduction theorems for tree metrics, circular-decomposable metrics, and metrics supported on K_{2,3}-free graphs, giving embeddings into ℓ1 with constant distortion. Finally, we also present lower bounds on dimension reduction techniques for other ℓp norms.
unknown title
Abstract
1 Introduction
This lecture series will focus primarily on finite metric spaces.
Definition 1. A metric space is a pair (X, d) where X is a nonempty set and d : X × X → ℝ+ is a map satisfying, for all ...
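Although the definition is truncated here, the standard metric axioms (nonnegativity, identity of indiscernibles, symmetry, and the triangle inequality) can be checked exhaustively on a finite set. A small illustrative checker, with names chosen for this sketch rather than taken from the lecture notes:

```python
from itertools import product

def is_metric(X, d):
    """Exhaustively verify the metric axioms for d on the finite set X."""
    for x, y in product(X, repeat=2):
        if d(x, y) < 0:                  # nonnegativity
            return False
        if (d(x, y) == 0) != (x == y):   # d(x, y) = 0 iff x = y
            return False
        if d(x, y) != d(y, x):           # symmetry
            return False
    return all(d(x, z) <= d(x, y) + d(y, z)  # triangle inequality
               for x, y, z in product(X, repeat=3))

X = [0, 1, 2]
assert is_metric(X, lambda a, b: abs(a - b))        # the line metric is a metric
assert not is_metric(X, lambda a, b: (a - b) ** 2)  # squared distance fails the triangle inequality
```

For instance, squared distance fails because d(0, 2) = 4 exceeds d(0, 1) + d(1, 2) = 2, which is exactly the kind of violation the triangle-inequality axiom rules out.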