Results 1–10 of 1,254

Polylog-Time and Near-Linear Work Approximation Scheme for Undirected Shortest Paths
Journal of the ACM, 2000
Cited by 38 (1 self)
"... Shortest paths computations constitute one of the most fundamental network problems. Nonetheless, known parallel shortest-paths algorithms are generally inefficient: they perform significantly more work (product of time and processors) than their sequential counterparts. This gap, known in the literature as the “transitive closure bottleneck,” poses a longstanding open problem. Our main result is an O(mn^{ε0} + s(m + n^{1+ε0}))-work polylog-time randomized algorithm that computes paths within (1 + O(1/polylog n)) of shortest from s source nodes to all other nodes in weighted undirected networks ..."

unknown title, 1994
"... Polylog-time and near-linear work approximation scheme for undirected shortest paths ..."

A Linear-Processor Polylog-Time Algorithm for Shortest Paths in Planar Graphs
1993
Cited by 16 (5 self)
"... We give an algorithm requiring polylog time and a linear number of processors to solve single-source shortest paths in directed planar graphs, bounded-genus graphs, and 2-dimensional overlap graphs. More generally, the algorithm works for any graph provided with a decomposition tree constructed using ..."

Minimum Cuts in Near-Linear Time
1999
Cited by 96 (12 self)
"... We significantly improve known time bounds for solving the minimum cut problem on undirected graphs. We use a "semi-duality" between minimum cuts and maximum spanning tree packings combined with our previously developed random sampling techniques. We give a randomized (Monte Carlo) algorithm ..."

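For context, the classic randomized (Monte Carlo) contraction algorithm that this line of work improves on can be sketched in a few lines. This is a simplified baseline for an unweighted multigraph given as an edge list, not the paper's near-linear algorithm:

```python
import random

def karger_min_cut(edges, rng=None):
    """One Monte Carlo contraction run: repeatedly merge the endpoints of a
    random edge until two super-nodes remain; the edges still crossing the
    two super-nodes form a (not necessarily minimum) cut."""
    rng = rng or random.Random()
    parent = {}  # union-find over the vertex set
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in edges:
        parent.setdefault(u, u)
        parent.setdefault(v, v)
    components = len(parent)
    while components > 2:
        u, v = edges[rng.randrange(len(edges))]
        ru, rv = find(u), find(v)
        if ru != rv:           # skip edges that have become self-loops
            parent[ru] = rv    # contract: merge the two super-nodes
            components -= 1
    return [(u, v) for u, v in edges if find(u) != find(v)]

def min_cut(edges, runs=50):
    """Repeat the contraction with independent seeds and keep the best cut,
    boosting the success probability of the Monte Carlo procedure."""
    return min((karger_min_cut(edges, random.Random(seed))
                for seed in range(runs)), key=len)
```

Each contraction run finds a fixed minimum cut with probability Ω(1/n²), so O(n² log n) independent runs suffice with high probability; avoiding that quadratic blow-up is exactly where the improved time bounds come in.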
Polynomial time approximation schemes for Euclidean TSP and other geometric problems
In Proceedings of the 37th IEEE Symposium on Foundations of Computer Science (FOCS'96), 1996
Cited by 399 (3 self)
"... We present a polynomial time approximation scheme for Euclidean TSP in fixed dimensions. For every fixed c > 1 and given any n nodes in R^2, a randomized version of the scheme finds a (1 + 1/c)-approximation to the optimum traveling salesman tour in O(n (log n)^{O(c)}) time. When the nodes are in R^d, the running time increases to O(n (log n)^{(O(√d c))^{d-1}}). For every fixed c, d the running time is n · poly(log n), that is, nearly linear in n. The algorithm can be derandomized, but this increases the running time by a factor O(n^d). The previous best approximation algorithm for the problem (due ..."

Compressive sampling
2006
Cited by 1427 (15 self)
"... Conventional wisdom and common practice in acquisition and reconstruction of images from frequency data follow the basic principle of the Nyquist density sampling theory. This principle states that to reconstruct an image, the number of Fourier samples we need to acquire must match the desired resolution of the image, i.e. the number of pixels in the image. This paper surveys an emerging theory which goes by the name of “compressive sampling” or “compressed sensing,” and which says that this conventional wisdom is inaccurate. Perhaps surprisingly, it is possible to reconstruct images or signals of scientific interest accurately and sometimes even exactly from a number of samples which is far smaller than the desired resolution of the image/signal, e.g. the number of pixels in the image. It is believed that compressive sampling has far-reaching implications. For example, it suggests the possibility of new data acquisition protocols that translate analog information into digital form with fewer sensors than what was considered necessary. This new sampling theory may come to underlie procedures for sampling and compressing data simultaneously. In this short survey, we provide some of the key mathematical insights underlying this new theory, and explain some of the interactions between compressive sampling and other fields such as statistics, information theory, coding theory, and theoretical computer science. ..."

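As a toy illustration of the sparse-recovery principle this survey describes (illustrative only: real compressive sampling uses convex relaxation or greedy pursuit, and the random matrix and 1-sparse signal here are assumptions), a signal with a single nonzero entry can be pinned down exactly from far fewer random linear measurements than its length:

```python
import random

def measure(x, A):
    """y = A x : m random linear measurements of the length-n signal x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def recover_1sparse(y, A, tol=1e-9):
    """Brute-force l0 recovery for a 1-sparse signal: find the single
    column of A that y is a scalar multiple of, then read off the
    coefficient. With m >= 2 generic measurements the support and value
    are determined, even though m is far below the signal length n."""
    n = len(A[0])
    for j in range(n):
        col = [row[j] for row in A]
        # least-squares scalar c minimizing ||y - c*col||
        c = sum(yi * ci for yi, ci in zip(y, col)) / sum(ci * ci for ci in col)
        if all(abs(yi - c * ci) < tol for yi, ci in zip(y, col)):
            x = [0.0] * n
            x[j] = c
            return x
    return None  # y is not (numerically) 1-sparse in this dictionary
```

Here 3 Gaussian measurements identify a length-8 signal with one nonzero, far below the 8 samples a Nyquist-style count would demand; the survey's point is that this phenomenon scales to k-sparse signals with tractable algorithms.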
Near-linear time approximation algorithms for curve simplification
Proc. of the 10th European Symposium on Algorithms, 2002
Cited by 65 (8 self)
"... We consider the problem of approximating a polygonal curve P under a given error criterion by another polygonal curve P′ whose vertices are a subset of the vertices of P. The goal is to minimize the number of vertices of P′ while ensuring that the error between P′ and P is below a certain threshold. We consider two different error measures: Hausdorff and Fréchet. For both error criteria, we present near-linear time approximation algorithms that, given a parameter ε > 0, compute a simplified polygonal curve P′ whose error is less than ε and size at most the size of ..."

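For orientation, the textbook baseline for vertex-subset simplification under a Hausdorff-type error is the classic Douglas–Peucker heuristic, sketched below. This is not the paper's near-linear algorithm (Douglas–Peucker is worst-case quadratic and carries no size guarantee), only the standard point of comparison:

```python
import math

def point_segment_dist(p, a, b):
    """Euclidean distance from point p to the segment ab."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # clamp the projection parameter to stay on the segment
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify(points, eps):
    """Douglas-Peucker: keep the farthest-deviating vertex and recurse,
    so every dropped vertex lies within eps of the simplified curve."""
    if len(points) <= 2:
        return list(points)
    a, b = points[0], points[-1]
    idx, dmax = 0, -1.0
    for i in range(1, len(points) - 1):
        d = point_segment_dist(points[i], a, b)
        if d > dmax:
            idx, dmax = i, d
    if dmax <= eps:
        return [a, b]                      # whole stretch fits one segment
    left = simplify(points[:idx + 1], eps)
    right = simplify(points[idx:], eps)
    return left[:-1] + right               # avoid duplicating the split vertex
```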
Searching in metric spaces
2001
Cited by 432 (38 self)
"... The problem of searching the elements of a set that are close to a given query element under some similarity criterion has a vast number of applications in many branches of computer science, from pattern recognition to textual and multimedia information retrieval. We are interested in the rather general case where the similarity criterion defines a metric space, instead of the more restricted case of a vector space. Many solutions have been proposed in different areas, in many cases without cross-knowledge. Because of this, the same ideas have been reconceived several times, and very different ..."

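A minimal example of an index in the spirit surveyed here, one that relies only on the metric axioms rather than vector coordinates, is the Burkhard–Keller tree. The sketch below uses edit distance on strings as an assumed example metric (this is one illustrative structure, not the survey's taxonomy):

```python
def edit_distance(a, b):
    """Levenshtein distance -- a metric on strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

class BKTree:
    """Burkhard-Keller tree: an index for range queries in a discrete
    metric space, pruning subtrees via the triangle inequality."""
    def __init__(self, dist, items=()):
        self.dist = dist
        self.root = None  # node = (item, {distance: child-node})
        for it in items:
            self.add(it)

    def add(self, item):
        if self.root is None:
            self.root = (item, {})
            return
        node = self.root
        while True:
            d = self.dist(item, node[0])
            if d in node[1]:
                node = node[1][d]
            else:
                node[1][d] = (item, {})
                return

    def query(self, target, radius):
        """All stored items within `radius` of `target`."""
        out, stack = [], [self.root] if self.root else []
        while stack:
            item, children = stack.pop()
            d = self.dist(target, item)
            if d <= radius:
                out.append(item)
            # triangle inequality: a child at distance cd from `item` can
            # only hold matches if |d - cd| <= radius
            for cd, child in children.items():
                if d - radius <= cd <= d + radius:
                    stack.append(child)
        return out
```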
A Near-Linear Time ε-Approximation Algorithm for Bipartite Geometric Matching
2011
Cited by 2 (0 self)
"... For point sets A, B ⊂ R^d, |A| = |B| = n, and for a parameter ε > 0, we present an algorithm that computes, in O(n · poly(log n, 1/ε)) time, a matching whose cost is within (1 + ε) of the optimal perfect matching with high probability; the previously best known algorithm takes Ω(n^{3/2}) time. We approximate the Lp norm using a distance function d(·, ·) based on a randomly shifted quadtree. Our algorithm iteratively generates an approximate minimum-cost augmenting path under d(·, ·) in time proportional to the length of the path. We show that the total length of the augmenting paths generated ..."

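The randomly shifted quadtree behind a distance like d(·, ·) can be sketched as follows. This is an illustrative toy for points in [0,1)^2; the fixed shift, depth, and charging rule here are assumptions for the sketch, not the paper's exact construction:

```python
def quadtree_dist(p, q, shift=(0.37, 0.81), depth=30):
    """Tree metric induced by a shifted quadtree on [0,1)^2: walk down the
    levels until p and q fall into different cells, and charge the side
    length of the last cell they shared. Choosing the shift uniformly at
    random is what lets this tree metric approximate Euclidean distance
    in expectation."""
    for level in range(1, depth + 1):
        scale = 2 ** level
        # grid cell of each point at this level, after the shift (mod 1)
        cp = tuple(int(((c + s) % 1.0) * scale) for c, s in zip(p, shift))
        cq = tuple(int(((c + s) % 1.0) * scale) for c, s in zip(q, shift))
        if cp != cq:
            return 2.0 ** -(level - 1)  # side length of last shared cell
    return 2.0 ** -depth
```

Because d(·, ·) is a tree metric, minimum-cost structures under it (like augmenting paths) can be maintained in near-linear time, which is the leverage the entry above describes.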