Results 1–10 of 49
On Indexing Mobile Objects
, 1999
Abstract

Cited by 202 (14 self)
We show how to index mobile objects in one and two dimensions using efficient dynamic external-memory data structures. The problem is motivated by real-life applications in the traffic monitoring, intelligent navigation, and mobile communications domains. For the 1-dimensional case, we give (i) a dynamic, external-memory algorithm with guaranteed worst-case performance and linear space and (ii) a practical approximation algorithm, also in the dynamic, external-memory setting, which has linear space and expected logarithmic query time. We also give an algorithm with guaranteed logarithmic query time for a restricted version of the problem. We present extensions of our techniques to two dimensions. In addition, we give a lower bound on the number of I/Os needed to answer the d-dimensional problem. Initial experimental results and comparisons to traditional indexing approaches are also included. 1 Introduction Traditional database management systems assume that data stored in the database rem...
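The core query the abstract describes, before any indexing, is a time-slice range query over linearly moving 1-D objects. A minimal brute-force sketch (all names here are illustrative, not from the paper; the paper's contribution is precisely to avoid this O(n) scan with an external-memory index):

```python
# Hypothetical sketch: report the 1-D moving objects whose extrapolated
# position lies in [lo, hi] at query time tq. Each object is a tuple
# (x0, v, t0) with trajectory x(t) = x0 + v * (t - t0).
def query_moving_objects(objects, lo, hi, tq):
    hits = []
    for i, (x0, v, t0) in enumerate(objects):
        x = x0 + v * (tq - t0)          # position at query time
        if lo <= x <= hi:
            hits.append(i)
    return hits

objects = [(0.0, 1.0, 0.0),    # starts at 0, moves right at speed 1
           (10.0, -2.0, 0.0),  # starts at 10, moves left at speed 2
           (5.0, 0.0, 0.0)]    # stationary at 5
print(query_moving_objects(objects, 3.0, 7.0, 2.0))  # -> [1, 2]
```

The standard trick the literature builds on is to view each trajectory as a line in the time-position plane, so the query becomes a stabbing problem on lines rather than a scan over points.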
Efficient Indexing Methods for Probabilistic Threshold Queries over Uncertain Data
 Proc. 30th Int’l Conf. Very Large Data Bases (VLDB)
, 2004
Abstract

Cited by 106 (20 self)
It is infeasible for a sensor database to contain the exact value of each sensor at all points in time. This uncertainty is inherent in these systems due to measurement and sampling errors, and resource limitations. To avoid drawing erroneous conclusions based on stale data, the use of uncertainty intervals, which model each data item as a range with an associated probability density function (pdf) rather than as a single value, has recently been proposed. Querying these uncertain data introduces imprecision into the answers, in the form of probability values that specify how likely it is that an answer satisfies the query. These queries are more expensive to evaluate than their traditional counterparts, but they are guaranteed to be correct and are more informative because of the probabilities accompanying the answers. Although the answer probabilities are useful, for many applications it is only necessary to know whether the probability exceeds a given threshold; we term these Probabilistic Threshold Queries (PTQ). In this paper we address the efficient computation of these types of queries. In particular, we develop two index structures and associated algorithms to efficiently answer PTQs. The first index scheme is based on the idea of augmenting uncertainty information to an R-tree. We establish the difficulty...
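A PTQ, as the abstract defines it, reports the items whose probability of satisfying the query exceeds a threshold. A minimal sketch of that semantics, assuming uniform pdfs over the uncertainty intervals (function names are illustrative; the paper's index structures avoid evaluating every item):

```python
# Hypothetical PTQ evaluation with uniform pdfs. An item's value is
# uncertain within interval (l, u); the query asks for items whose
# probability of lying in (a, b) exceeds threshold tau.
def appearance_probability(interval, query):
    """P(value in query) when value ~ Uniform(interval)."""
    (l, u), (a, b) = interval, query
    overlap = max(0.0, min(u, b) - max(l, a))
    return overlap / (u - l)

def ptq(items, query, tau):
    """Report the ids of items whose probability exceeds tau."""
    return [i for i, iv in items if appearance_probability(iv, query) > tau]

readings = [(1, (0.0, 10.0)), (2, (4.0, 6.0)), (3, (9.0, 12.0))]
print(ptq(readings, (5.0, 8.0), 0.4))  # -> [2]
```

The threshold is what makes indexing possible: an index can prune items whose interval overlap already bounds the probability below tau, without integrating the pdf at all.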
Arrangements and Their Applications
 Handbook of Computational Geometry
, 1998
Abstract

Cited by 78 (20 self)
The arrangement of a finite collection of geometric objects is the decomposition of the space into connected cells induced by them. We survey combinatorial and algorithmic properties of arrangements of arcs in the plane and of surface patches in higher dimensions. We present many applications of arrangements to problems in motion planning, visualization, range searching, molecular modeling, and geometric optimization. Some results involving planar arrangements of arcs have been presented in a companion chapter in this book and are extended in this chapter to higher dimensions. Work by P.A. was supported by Army Research Office MURI grant DAAH04-96-1-0013, by a Sloan fellowship, by an NYI award, and by a grant from the U.S.-Israeli Binational Science Foundation. Work by M.S. was supported by NSF grants CCR-91-22103 and CCR-93-11127, by a Max-Planck Research Award, and by grants from the U.S.-Israeli Binational Science Foundation, the Israel Science Fund administered by the Israeli Ac...
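The simplest concrete instance of an arrangement is that of n lines in the plane: in general position they meet in n(n-1)/2 vertices, which together with the edges and faces they induce form the cell decomposition. A small sketch computing the vertex set (representation and names are ours, not from the survey):

```python
from itertools import combinations

# Sketch: vertices of an arrangement of lines, each line given as
# (a, b, c) meaning a*x + b*y = c. Pairwise intersection by Cramer's rule.
def arrangement_vertices(lines):
    verts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) > 1e-12:            # skip (near-)parallel pairs
            x = (c1 * b2 - c2 * b1) / det
            y = (a1 * c2 - a2 * c1) / det
            verts.append((x, y))
    return verts

# Three lines in general position: x = 0, y = 0, x + y = 1.
lines = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 1.0, 1.0)]
print(len(arrangement_vertices(lines)))  # -> 3, i.e. n(n-1)/2 for n = 3
```

The combinatorial bounds surveyed in the chapter generalize this quadratic vertex count to arcs and to surface patches in higher dimensions.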
Range Searching
, 1996
Abstract

Cited by 70 (1 self)
Range searching is one of the central problems in computational geometry, because it arises in many applications and a wide variety of geometric problems can be formulated as a range-searching problem. A typical range-searching problem has the following form. Let S be a set of n points in R^d, and let R be a family of subsets; elements of R are called ranges. We wish to preprocess S into a data structure so that for a query range R, the points in S ∩ R can be reported or counted efficiently. Typical examples of ranges include rectangles, halfspaces, simplices, and balls. If we are only interested in answering a single query, it can be done in linear time, using linear space, by simply checking for each point p ∈ S whether p lies in the query range.
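The linear-time single-query baseline mentioned at the end of the abstract is just a scan. A minimal sketch for axis-aligned rectangular ranges in the plane (our names; the survey's subject is the data structures that beat this when many queries are asked):

```python
# O(n) range reporting by brute force: return the points of S inside an
# axis-aligned rectangle (xmin, xmax, ymin, ymax). No preprocessing.
def report_in_range(points, rect):
    xmin, xmax, ymin, ymax = rect
    return [p for p in points
            if xmin <= p[0] <= xmax and ymin <= p[1] <= ymax]

S = [(1, 1), (2, 5), (4, 4), (7, 2)]
print(report_in_range(S, (0, 5, 0, 4)))  # -> [(1, 1), (4, 4)]
```

Preprocessing structures such as range trees or partition trees trade space and build time for per-query cost far below this O(n) scan.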
Efficient Searching with Linear Constraints (Extended Abstract)
Abstract

Cited by 56 (17 self)
Pankaj K. Agarwal, Lars Arge, Jeff Erickson, Paolo G. Franciosa, Jeffrey Scott Vitter. Abstract: We show how to preprocess a set S of points in R^d to get an external-memory data structure that efficiently supports linear-constraint queries. Each query is in the form of a linear constraint a · x ≤ b; the data structure must report all the points of S that satisfy the query. Our goal is to minimize the number of disk blocks required to store the data structure and the number of disk accesses (I/Os) required to answer a query. For d = 2, we present the first near-linear-size data structures that can answer linear-constraint queries using an optimal number of I/Os. We also present a linear-size data structure that can answer queries efficiently in the worst case. We combine these two approaches to obtain tradeoffs between space and query time. Finally, we show that some of our techniques extend to higher dimensions d.
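For concreteness, the query a · x ≤ b in the plane selects the points on one side of a line (a halfplane). A naive O(n) sketch of the query semantics (our names; the paper's point is to answer this with few I/Os on externally stored data):

```python
# Hypothetical sketch of a linear-constraint (halfplane) query for d = 2:
# report the points p with a . p <= b, by checking the dot product directly.
def halfplane_query(points, a, b):
    return [p for p in points
            if a[0] * p[0] + a[1] * p[1] <= b]

S = [(0.0, 0.0), (2.0, 3.0), (5.0, 1.0)]
print(halfplane_query(S, (1.0, 1.0), 4.0))  # points with x + y <= 4
```

In external memory the cost measure changes: the scan above costs n/B block reads for block size B, and the paper's structures reduce that to near the output-sensitive optimum.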
Efficient Aggregation over Objects with Extent (Extended Abstract)
 Tech. Report UCR-CS-01-01, CS Dept
, 2002
Abstract

Cited by 33 (8 self)
We examine the problem of efficiently computing sum/count/avg aggregates over...
Random Sampling, Halfspace Range Reporting, and Construction of (≤k)-Levels in Three Dimensions
 SIAM J. COMPUT
, 1999
Abstract

Cited by 32 (7 self)
Given n points in three dimensions, we show how to answer halfspace range reporting queries in O(log n + k) expected time for an output size k. Our data structure can be preprocessed in optimal O(n log n) expected time. We apply this result to obtain the first optimal randomized algorithm for the construction of the (≤k)-level in an arrangement of n planes in three dimensions. The algorithm runs in O(n log n + nk²) expected time. Our techniques are based on random sampling. Applications in two dimensions include an improved data structure for "k nearest neighbors" queries, and an algorithm that constructs the order-k Voronoi diagram in O(n log n + nk log k) expected time.
Using the Triangle Inequality to Reduce the Number of Comparisons Required for Similarity-Based Retrieval
 Proc. of SPIE/IS&T Conf. on Storage and Retrieval for Image and Video Databases IV
, 1996
Abstract

Cited by 28 (1 self)
Dissimilarity measures, the basis of similarity-based retrieval, can be viewed as distances, and a similarity-based search as a nearest-neighbor search. Though there has been extensive research on data structures and search methods to support nearest-neighbor searching, these indexing and dimension-reduction methods are generally not applicable to non-coordinate data and non-Euclidean distance measures. In this paper we re-examine and extend previous work of other researchers on best-match searching based on the triangle inequality. These methods can be used to organize both non-coordinate data and non-Euclidean metric similarity measures. The effectiveness of the indexes depends on the actual dimensionality of the feature set, data, and similarity metric used. We show that these methods provide significant performance improvements and may be of practical value in real-world databases. Keywords: image database indexing, similarity-based retrieval, best-match searching, triangle inequali...
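The triangle-inequality pruning the abstract refers to works as follows: if distances from every item to a fixed pivot are precomputed, then |d(q, pivot) − d(pivot, x)| is a lower bound on d(q, x), so any item whose bound already exceeds the best distance found so far can be skipped without computing its distance. A minimal single-pivot sketch (our names, not the paper's; real schemes use many pivots):

```python
# Hypothetical best-match search pruned by the triangle inequality.
# pivot_dists[i] = dist(pivot, data[i]), precomputed offline;
# qp_dist = dist(query, pivot), computed once per query.
def pivot_search(query, data, dist, pivot_dists, qp_dist):
    best, best_d = None, float("inf")
    for i, x in enumerate(data):
        # |qp_dist - pivot_dists[i]| <= dist(query, x) by the triangle
        # inequality, so this item cannot improve on best_d: skip it.
        if abs(qp_dist - pivot_dists[i]) >= best_d:
            continue
        d = dist(query, x)              # the expensive comparison
        if d < best_d:
            best, best_d = x, d
    return best, best_d

data = [1.0, 4.0, 9.0, 15.0]
dist = lambda a, b: abs(a - b)
pivot = 0.0
pd = [dist(pivot, x) for x in data]
print(pivot_search(8.0, data, dist, pd, dist(8.0, pivot)))  # -> (9.0, 1.0)
```

Only metric properties are used, which is why the method applies to non-coordinate data and non-Euclidean measures where coordinate-based indexes do not.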
Regular and Non-Regular Point Sets: Properties and Reconstruction
 In "Computational Geometry: Theory and Applications"
Abstract

Cited by 20 (0 self)
In this paper, we address the problem of curve and surface reconstruction from sets of points. We introduce regular interpolants, which are polygonal approximations of curves and surfaces satisfying a new regularity condition. This new condition, which is an extension of the popular notion of sampling to the practical case of discrete shapes, seems much more realistic than previously proposed conditions based on properties of the underlying continuous shapes. Indeed, contrary to previous sampling criteria, our regularity condition can be checked on the basis of the samples alone and can be turned into a provably correct curve and surface reconstruction algorithm. Our reconstruction methods can also be applied to non-regular and unorganized point sets, revealing a larger part of the inner structure of such point sets than past approaches. Several real-size reconstruction examples validate the new method.