Results 1–10 of 27
Kinetic Data Structures: A State of the Art Report
, 1998
Abstract

Cited by 92 (27 self)
... In this paper we present a general framework for addressing such problems and the tools for designing and analyzing relevant algorithms, which we call kinetic data structures. We discuss kinetic data structures for a variety of fundamental geometric problems, such as the maintenance of convex hulls, Voronoi and Delaunay diagrams, closest pairs, and intersection and visibility problems. We also briefly address the issues that arise in implementing such structures robustly and efficiently. The resulting techniques satisfy three desirable properties: (1) they exploit the continuity of the motion of the objects to gain efficiency, (2) the number of events processed by the algorithms is close to the minimum necessary in the worst case, and (3) any object may change its 'flight plan' at any moment with a low cost update to the simulation data structures. For computer applications dealing with motion in the physical world, kinetic data structures lead to simulation performance unattainable by other means. In addition, they raise fundamentally new combinatorial and algorithmic questions whose study may prove fruitful for other disciplines as well.
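As a toy illustration of the certificate/event mechanism (not code from the paper): maintaining the maximum of n points moving linearly on a line. Each certificate "x_m(t) >= x_i(t)" fails when a faster point overtakes the current maximum; this sketch naively rebuilds all certificates at each event, where a real kinetic data structure would update only the affected ones.

```python
import heapq

def kinetic_max(points, t_end):
    """Track the argmax of x_i(t) = x0_i + v_i * t over [0, t_end]
    using certificate-failure events (a toy kinetic data structure).
    points is a list of (x0, v) pairs; returns [(time, argmax), ...]."""
    def argmax(t):
        return max(range(len(points)), key=lambda i: points[i][0] + points[i][1] * t)

    def failures(m, t_now):
        # certificate x_m(t) >= x_i(t) fails where the two trajectories cross
        evs = []
        x0m, vm = points[m]
        for i, (x0, v) in enumerate(points):
            if v > vm:  # only faster points can overtake the current max
                t = (x0m - x0) / (v - vm)
                if t > t_now:
                    evs.append((t, i))
        return evs

    t, m = 0.0, argmax(0.0)
    history = [(0.0, m)]
    heap = failures(m, t)
    heapq.heapify(heap)
    while heap:
        t, i = heapq.heappop(heap)
        if t > t_end:
            break
        new_m = argmax(t + 1e-12)  # nudge past the crossing to break ties
        if new_m != m:
            m = new_m
            history.append((t, m))
            heap = failures(m, t)  # rebuild certificates for the new max
            heapq.heapify(heap)
    return history
```

A point's "flight plan" change would simply mean replacing its (x0, v) pair and rescheduling the affected certificates.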
Geometric Applications of a Randomized Optimization Technique
 Discrete Comput. Geom
, 1999
Abstract

Cited by 53 (6 self)
We propose a simple, general, randomized technique to reduce certain geometric optimization problems to their corresponding decision problems. These reductions increase the expected time complexity by only a constant factor and eliminate extra logarithmic factors in previous, often more complicated, deterministic approaches (such as parametric searching). Faster algorithms are thus obtained for a variety of problems in computational geometry: finding minimal k-point subsets, matching point sets under translation, computing rectilinear p-centers and discrete 1-centers, and solving linear programs with k violations.

1 Introduction

Consider the classic randomized algorithm for finding the minimum of r numbers min{A[1], ..., A[r]}:

Algorithm randmin
1. randomly pick a permutation ⟨i_1, ..., i_r⟩ of ⟨1, ..., r⟩
2. t ← ∞
3. for k = 1, ..., r do
4.   if A[i_k] < t then
5.     t ← A[i_k]
6. return t

By a well-known fact [27, 44], the expected number of times that step 5 is execut...
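A runnable version of randmin (an illustration, not from the paper) that also counts how often the update in step 5 fires:

```python
import random

def randmin(a):
    """Randomized minimum finding: scan a random permutation of the
    input, keeping the smallest value seen so far."""
    order = list(range(len(a)))
    random.shuffle(order)            # step 1: random permutation
    t, updates = float('inf'), 0     # step 2: t <- infinity
    for i in order:                  # step 3
        if a[i] < t:                 # step 4
            t = a[i]                 # step 5
            updates += 1             # expected ~ H_r = 1 + 1/2 + ... + 1/r times
    return t, updates
```

The update counter makes the harmonic-number bound on step 5 easy to check empirically by averaging over many runs.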
Geometric Range Searching
, 1994
Abstract

Cited by 46 (2 self)
In geometric range searching, algorithmic problems of the following type are considered: Given an n-point set P in the plane, build a data structure so that, given a query triangle R, the number of points of P lying in R can be determined quickly. Problems of this type are of crucial importance in computational geometry, as they can be used as subroutines in many seemingly unrelated algorithms. We present a survey of results and main techniques in this area.
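For concreteness, a brute-force O(n)-per-query baseline for the triangle-counting query described above (illustrative code, not from the survey); the structures surveyed answer such queries far faster after preprocessing:

```python
def count_in_triangle(points, tri):
    """Count the points of P lying in a query triangle, O(n) per query.
    A point is inside (or on the boundary) iff the three signed areas
    with respect to the triangle's edges all share a sign."""
    (ax, ay), (bx, by), (cx, cy) = tri

    def cross(ox, oy, px, py, qx, qy):
        # z-component of (p - o) x (q - o)
        return (px - ox) * (qy - oy) - (py - oy) * (qx - ox)

    count = 0
    for (x, y) in points:
        d1 = cross(ax, ay, bx, by, x, y)
        d2 = cross(bx, by, cx, cy, x, y)
        d3 = cross(cx, cy, ax, ay, x, y)
        if (d1 >= 0 and d2 >= 0 and d3 >= 0) or (d1 <= 0 and d2 <= 0 and d3 <= 0):
            count += 1
    return count
```

The sign test works for either vertex orientation of the query triangle, so the caller need not normalize it.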
Parallel Algorithms for Higher-Dimensional Convex Hulls
Abstract

Cited by 44 (14 self)
We give fast randomized and deterministic parallel methods for constructing convex hulls in R^d, for any fixed d. Our methods are for the weakest shared-memory model, the EREW PRAM, and have optimal work bounds (with high probability for the randomized methods). In particular, we show that the convex hull of n points in R^d can be constructed in O(log n) time using O(n log n + n^⌊d/2⌋) work, with high probability. We also show that it can be constructed deterministically in O(log² n) time using O(n log n) work for d = 3, and in O(log n) time using O(n^⌊d/2⌋ log^{c(⌈d/2⌉−⌊d/2⌋)} n) work for d ≥ 4, where c > 0 is a constant, which is optimal for even d ≥ 4. We also show how to make our 3-dimensional methods output-sensitive with only a small increase in running time. These methods can be applied to other problems as well. A variation of the convex hull algorithm for even dimensions deterministically constructs a (1/r)-cutting of n hyperplanes in R^d in O(log n) time using optimal O(nr^{d−1}) work; when r = n, we obtain their arrangement and a point-location data structure for it. With appropriate modifications, our deterministic 3-dimensional convex hull algorithm can be used to compute, in the same resource bounds, the intersection of n balls of equal radius in R³. This leads to a sequential algorithm for computing the diameter of a point set in R³ with running time O(n log³ n), which is arguably simpler than an algorithm with the same running time by Brönnimann et al.
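As a point of comparison, the standard sequential O(n log n) planar hull algorithm (Andrew's monotone chain; illustrative, not the parallel method of the paper):

```python
def convex_hull_2d(points):
    """Andrew's monotone chain: sort once, then build the lower and
    upper hulls with a stack, popping non-left turns.  O(n log n)
    overall; returns hull vertices in counterclockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def half(seq):
        chain = []
        for p in seq:
            while len(chain) >= 2:
                (ox, oy), (ax, ay) = chain[-2], chain[-1]
                # pop while the last two points and p do not make a left turn
                if (ax - ox) * (p[1] - oy) - (ay - oy) * (p[0] - ox) <= 0:
                    chain.pop()
                else:
                    break
            chain.append(p)
        return chain

    lower, upper = half(pts), half(pts[::-1])
    return lower[:-1] + upper[:-1]  # drop duplicated endpoints
```

The same divide-and-conquer structure underlies the parallel methods, but distributing the merge steps across EREW PRAM processors without concurrent reads is where the difficulty lies.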
Extracting Skeletal Curves from 3D Scattered Data
, 1997
Abstract

Cited by 38 (0 self)
We introduce a method for the construction of skeletal curves from an unorganized collection of scattered data points lying on a surface. These curves may have a tree-like structure to capture branching shapes such as blood vessels. The skeletal curves can be used for different applications ranging from surface reconstruction to object recognition. As input, the algorithm takes a set of 3D points. It returns a set of curves arranged in a tree structure. The only interaction needed is the selection of a data point which represents the root of the tree. A neighborhood graph is constructed over the set of points to compute geodesic distances between the root point and the other points. Connected level sets of the distance map are then extracted and organized in a tree structure. The centers of these level sets constitute the skeletal curves. Keywords: visualization, skeletal curve, cylindrical decomposition, generalized cylinders, reconstruction
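A much-simplified sketch of the pipeline the abstract describes, with branch handling (splitting level sets into connected components) omitted; the neighborhood size k and level-set width are illustrative parameters:

```python
import heapq, math

def skeletal_centers(points, root, k=4, width=1.0):
    """Sketch: build a k-nearest-neighbor graph, compute geodesic
    (graph) distances from the chosen root with Dijkstra, bucket the
    points into level sets of the distance map, and return the
    centroid of each bucket as a skeletal-curve sample."""
    n = len(points)

    def dist(i, j):
        return math.dist(points[i], points[j])

    # symmetrized k-NN neighborhood graph
    nbrs = [[] for _ in range(n)]
    for i in range(n):
        for j in sorted(range(n), key=lambda j: dist(i, j))[1:k + 1]:
            nbrs[i].append(j)
            nbrs[j].append(i)

    # Dijkstra from the root gives the geodesic distance map
    d = [math.inf] * n
    d[root] = 0.0
    heap = [(0.0, root)]
    while heap:
        du, u = heapq.heappop(heap)
        if du > d[u]:
            continue
        for v in nbrs[u]:
            nd = du + dist(u, v)
            if nd < d[v]:
                d[v] = nd
                heapq.heappush(heap, (nd, v))

    # level sets of the distance map -> centroids
    buckets = {}
    for i in range(n):
        buckets.setdefault(int(d[i] // width), []).append(i)
    return [tuple(sum(points[i][c] for i in idx) / len(idx)
                  for c in range(len(points[0])))
            for _, idx in sorted(buckets.items())]
```

On a tubular shape the centroids trace the axis of the tube; the full method additionally splits each level set into connected components to follow branches.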
New Lower Bounds for Hopcroft's Problem
, 1996
Abstract

Cited by 33 (6 self)
We establish new lower bounds on the complexity of the following basic geometric problem, attributed to John Hopcroft: Given a set of n points and m hyperplanes in R^d, is any point contained in any hyperplane? We define a general class of partitioning algorithms, and show that in the worst case, for all m and n, any such algorithm requires time Ω(n log m + n^{2/3}m^{2/3} + m log n) in two dimensions, or Ω(n log m + n^{5/6}m^{1/2} + n^{1/2}m^{5/6} + m log n) in three or more dimensions. We obtain slightly higher bounds for the counting version of Hopcroft's problem in four or more dimensions. Our planar lower bound is within a factor of 2^{O(log* (n+m))} of the best known upper bound, due to Matoušek. Previously, the best known lower bound, in any dimension, was Ω(n log m + m log n). We develop our lower bounds in two stages. First we define a combinatorial representation of the relative order type of a set of points and hyperplanes, called a monochromatic cover, and derive low...
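For reference, the trivial O(nm) algorithm for Hopcroft's problem that the lower bounds are measured against (illustrative code):

```python
def hopcroft_naive(points, hyperplanes, eps=1e-9):
    """Brute-force incidence test: is any point on any hyperplane?
    Each hyperplane is given as (a, b), representing {x : a . x = b}.
    O(nm) time; partitioning algorithms can do better, but not below
    the roughly (nm)^{2/3} barrier in the plane."""
    for p in points:
        for a, b in hyperplanes:
            if abs(sum(ai * pi for ai, pi in zip(a, p)) - b) < eps:
                return True
    return False
```

With exact (rational or integer) inputs the epsilon tolerance can be dropped in favor of an exact equality test.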
Almost tight upper bounds for vertical decompositions in four dimensions
 In Proc. 42nd IEEE Symposium on Foundations of Computer Science
, 2001
Abstract

Cited by 31 (6 self)
We show that the complexity of the vertical decomposition of an arrangement of n fixed-degree algebraic surfaces or surface patches in four dimensions is O(n^{4+ε}), for any ε > 0. This improves the best previously known upper bound for this problem by a near-linear factor, and settles a major problem in the theory of arrangements of surfaces, open since 1989. The new bound can be extended to higher dimensions, yielding the bound O(n^{2d−4+ε}), for any ε > 0, on the complexity of vertical decompositions in dimensions d ≥ 4. We also describe the immediate algorithmic applications of these results, which include improved algorithms for point location, range searching, ray shooting, robot motion planning, and some geometric optimization problems.
Penetration Depth of Two Convex Polytopes in 3D
 Nordic J. Computing
, 2000
Abstract

Cited by 24 (2 self)
Let A and B be two convex polytopes in R³ with m and n facets, respectively. The penetration depth of A and B, denoted π(A, B), is the minimum distance by which A has to be translated so that A and B do not intersect. We present a randomized algorithm that computes π(A, B) in O(m^{3/4+ε}n^{3/4+ε} + m^{1+ε} + n^{1+ε}) expected time, for any constant ε > 0. It also computes a vector t such that ‖t‖ = π(A, B) and int(A + t) ∩ B = ∅. We show that if the Minkowski sum B ⊕ (−A) has K facets, then the expected running time of our algorithm is O(K^{1+ε}), for any ε > 0.
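A toy analogue for intuition (not the paper's algorithm): for two overlapping axis-aligned boxes, the penetration depth is simply the smallest per-axis overlap, whereas general convex polytopes require working with the Minkowski sum B ⊕ (−A):

```python
import math

def aabb_penetration_depth(a_min, a_max, b_min, b_max):
    """Penetration depth of two axis-aligned boxes, each given by its
    min and max corners: the minimum translation distance along a
    coordinate axis that separates them, or 0.0 if already disjoint."""
    best = math.inf
    for d in range(len(a_min)):
        overlap = min(a_max[d], b_max[d]) - max(a_min[d], b_min[d])
        if overlap <= 0:
            return 0.0  # disjoint in this axis: no penetration at all
        best = min(best, overlap)
    return best
```

For boxes the separating translation is always axis-parallel, which is why the per-axis minimum suffices; convex polytopes admit separating directions in any orientation.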
Fast algorithms for collision and proximity problems involving moving geometric objects
 Comput. Geom. Theory Appl
, 1996
Abstract

Cited by 17 (1 self)
Consider a set of geometric objects, such as points, line segments, or axes-parallel hyperrectangles in R^d, that move with constant but possibly different velocities along linear trajectories. Efficient algorithms are presented for several problems defined on such objects, such as determining whether any two objects ever collide and computing the minimum interpoint separation or minimum diameter that ever occurs. In particular, two open problems from the literature are solved: deciding in o(n²) time if there is a collision in a set of n moving points in R², where the points move at constant but possibly different velocities, and the analogous problem for detecting a red-blue collision between sets of red and blue moving points. The strategy used involves reducing the given problem on moving objects to a different problem on a set of static objects, and then solving the latter problem using techniques based on sweeping, orthogonal range searching, simplex composition, and parametric search.
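The quadratic baseline that the paper improves on, for points moving with constant velocities (illustrative code): points p_i + t·v_i and p_j + t·v_j collide iff t·(v_i − v_j) = p_j − p_i has a common solution t ≥ 0 across all coordinates:

```python
def first_collision(points, velocities, eps=1e-9):
    """Brute-force O(n^2) collision detection for points moving as
    p_i + t*v_i; returns the earliest collision time, or None."""
    n, dim = len(points), len(points[0])
    best = None
    for i in range(n):
        for j in range(i + 1, n):
            t, ok = None, True
            for d in range(dim):
                dp = points[j][d] - points[i][d]
                dv = velocities[i][d] - velocities[j][d]
                if abs(dv) < eps:
                    # same speed in this coordinate: must already agree
                    ok = abs(dp) < eps
                else:
                    cand = dp / dv
                    ok = cand >= -eps and (t is None or abs(cand - t) < eps)
                    if t is None:
                        t = cand
                if not ok:
                    break
            if ok:
                tc = t if t is not None else 0.0
                if best is None or tc < best:
                    best = tc
    return best
```

The paper's o(n²) decision algorithms instead map each moving point to a static object (e.g., its trajectory line in space-time) and query a static structure.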
Derandomization in Computational Geometry
, 1996
Abstract

Cited by 17 (1 self)
We survey techniques for replacing randomized algorithms in computational geometry by deterministic ones with a similar asymptotic running time.

1 Randomized algorithms and derandomization

A rapid growth of knowledge about randomized algorithms stimulates research in derandomization, that is, replacing randomized algorithms by deterministic ones with as small a decrease in efficiency as possible. Related to the problem of derandomization is the question of reducing the number of random bits needed by a randomized algorithm while retaining its efficiency; derandomization can be viewed as the ultimate case. Randomized algorithms are also related to probabilistic proofs and constructions in combinatorics (which came first historically), whose development has similarly been accompanied by an effort to replace them by explicit, nonrandom constructions whenever possible. Derandomization of algorithms can be seen as part of an effort to map the power of randomness and explain its role. ...
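The textbook example of the kind of derandomization the survey discusses is the method of conditional expectations applied to MAX-CUT (illustrative, not from the survey): a uniformly random 2-coloring cuts at least half the edges in expectation, and fixing vertices one at a time, each against the heavier side of its already-fixed neighbors, makes that guarantee deterministic:

```python
def derandomized_maxcut(n, edges):
    """Method of conditional expectations for MAX-CUT on vertices
    0..n-1: greedily fix each vertex to the side that cuts more of
    its already-fixed neighbors, guaranteeing >= len(edges)/2 cut
    edges deterministically."""
    side = {}
    for v in range(n):
        on0 = sum(1 for a, b in edges
                  if (a == v and side.get(b) == 0) or (b == v and side.get(a) == 0))
        on1 = sum(1 for a, b in edges
                  if (a == v and side.get(b) == 1) or (b == v and side.get(a) == 1))
        side[v] = 0 if on1 >= on0 else 1  # oppose the heavier side
    cut = sum(1 for a, b in edges if side[a] != side[b])
    return side, cut
```

Each greedy choice keeps the conditional expectation of the cut size at least |E|/2, so the final deterministic cut can never fall below the randomized expectation.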