Results 1–10 of 15
Towards In-Place Geometric Algorithms and Data Structures
In Proceedings of the Twentieth ACM Symposium on Computational Geometry, 2003
Cited by 18 (4 self)
For many geometric problems, there are efficient algorithms that surprisingly use very little extra space other than the given array holding the input. For many geometric query problems, there are efficient data structures that need no extra space at all other than an array holding a permutation of the input. In this paper, we obtain the first such space-economical solutions for a number of fundamental problems, including three-dimensional convex hulls, two-dimensional Delaunay triangulations, fixed-dimensional range queries, and fixed-dimensional nearest neighbor queries.
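The "array holding a permutation of the input" idea can be illustrated in miniature with one-dimensional predecessor search: sorting the input array in place is the entire preprocessing, and queries run by binary search using O(1) words beyond the array. This is only a toy sketch of the principle, not the paper's geometric constructions; the function names are illustrative.

```python
import bisect

def preprocess(points):
    # The "data structure" is just a permutation of the input:
    # sort the array in place, using no auxiliary storage.
    points.sort()

def predecessor(points, x):
    # Binary search over the permuted input array: O(log n) time,
    # O(1) words of extra space beyond the array itself.
    i = bisect.bisect_right(points, x)
    return points[i - 1] if i > 0 else None
```

The same permutation-is-the-structure principle underlies the paper's far less trivial constructions for hulls, triangulations, and range queries.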
Cache-oblivious algorithms and data structures
In SWAT, 2004
Cited by 11 (1 self)
Frigo, Leiserson, Prokop and Ramachandran in 1999 introduced the ideal-cache model as a formal model of computation for developing algorithms in environments with multiple levels of caching, and coined the terminology of cache-oblivious algorithms. Cache-oblivious algorithms are described as standard RAM algorithms with only one memory level, i.e. without any knowledge about memory hierarchies, but are analyzed in the two-level I/O model of Aggarwal and Vitter for an arbitrary memory and block size and an optimal offline cache replacement strategy. The result is algorithms that automatically apply to multi-level memory hierarchies. This paper gives an overview of the results achieved on cache-oblivious algorithms and data structures since the seminal paper by Frigo et al.
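The flavor of a cache-oblivious algorithm can be conveyed by the classic recursive matrix transpose: the code never mentions a block size B, yet the recursion eventually produces subproblems that fit in a cache block of any size, so the I/O analysis holds for every B simultaneously. A minimal sketch (pure Python over lists of lists; not taken from the survey itself):

```python
def co_transpose(A, B, r0=0, c0=0, n=None, m=None):
    # Writes the transpose of the n-by-m submatrix of A starting at
    # (r0, c0) into B, by always splitting the larger dimension.
    # No block size appears anywhere: the recursion adapts to every
    # cache level at once.
    if n is None:
        n, m = len(A), len(A[0])
    if n == 0 or m == 0:
        return
    if n == 1 and m == 1:
        B[c0][r0] = A[r0][c0]
    elif n >= m:
        co_transpose(A, B, r0, c0, n // 2, m)
        co_transpose(A, B, r0 + n // 2, c0, n - n // 2, m)
    else:
        co_transpose(A, B, r0, c0, n, m // 2)
        co_transpose(A, B, r0, c0 + m // 2, n, m - m // 2)
```

In the ideal-cache model this recursion incurs O(nm/B) cache misses, matching the cache-aware blocked transpose without knowing B.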
A space-efficient algorithm for segment intersection
In Proc. 15th Canad. Conf. Comput. Geom., 2003
Cited by 9 (3 self)
We examine the space requirement for the classic line-segment intersection problem. Using so-called implicit data structures, we show how to make the standard sweep-line algorithm run in O((n + k) log² n) time with only O(log² n) extra space, where n is the number of line segments and k is the number of intersections. If division is allowed and input can be destroyed, the algorithm can run in O((n + k) log n) time with O(1) extra space.
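The sweep-line algorithm's core primitive is an O(1)-space, O(1)-time test for whether two segments cross, built from orientation (cross-product sign) predicates; a brute-force O(n²) loop over this test is the baseline that the sweep improves on. A sketch, with degenerate collinear and endpoint-touching cases deliberately ignored and names purely illustrative:

```python
def orient(p, q, r):
    # Sign of the cross product (q - p) x (r - p):
    # > 0 left turn, < 0 right turn, 0 collinear.
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def segments_intersect(a, b, c, d):
    # Proper-crossing test for segments ab and cd: each segment's
    # endpoints must straddle the other segment's supporting line.
    return (orient(c, d, a) > 0) != (orient(c, d, b) > 0) and \
           (orient(a, b, c) > 0) != (orient(a, b, d) > 0)
```

Because each test touches only the two segments involved, the space cost of any intersection algorithm comes entirely from its bookkeeping structures, which is exactly what the implicit-structure sweep attacks.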
Space-Efficient Algorithms for Klee’s Measure Problem, 2005
Cited by 6 (0 self)
We give space-efficient geometric algorithms for three related problems. Given a set of n axis-aligned rectangles in the plane, we calculate the area covered by the union of these rectangles (Klee’s measure problem) in O(n^(3/2) log n) time with O(√n) extra space. If the input can be destroyed and there are no degenerate cases and input coordinates are all integers, we can solve Klee’s measure problem in O(n log² n) time with O(log² n) extra space. Given a set of n points in the plane, we find the axis-aligned unit square that covers the maximum number of points in O(n log³ n) time with O(log² n) extra space.
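The one-dimensional analogue of Klee's measure problem (total length covered by a union of intervals) already exhibits the space-efficient pattern: sort the input array in place, then a single left-to-right sweep needs only O(1) words of working state. A sketch in that spirit (not the paper's two-dimensional algorithm):

```python
def union_length(intervals):
    # 1-d Klee's measure: in-place sort of the input list of (lo, hi)
    # pairs, then one scan carrying only two words of state.
    intervals.sort()
    total, covered_to = 0, float("-inf")
    for lo, hi in intervals:
        if hi > covered_to:
            # Count only the part not already covered.
            total += hi - max(lo, covered_to)
            covered_to = hi
    return total
```

The planar problem is harder precisely because the sweep over one axis must maintain a union-of-intervals measure over the other axis, and doing that in o(n) extra space is the paper's contribution.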
Putting your data structure on a diet
In preparation (2006). [Ask Jyrki for details], 2007
Cited by 4 (3 self)
Abstract. Consider a data structure D that stores a dynamic collection of elements. Assume that D uses a linear number of words in addition to the elements stored. In this paper several data-structural transformations are described that can be used to transform D into another data structure D′ that supports the same operations as D, has considerably smaller memory overhead than D, and performs the supported operations slower by a small constant factor or a small additive term, depending on the data structure and operation in question. The compaction technique has been successfully applied to linked lists, dictionaries, and priority queues.
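One standard compaction idea behind such transformations is to group elements into fixed-capacity blocks, so per-element pointer overhead shrinks from one pointer per element to one per block. A toy sketch of that idea for a linked-list-style container (illustrative only; not the paper's construction):

```python
class ChunkedList:
    """Append-only sequence stored as fixed-capacity blocks: the
    structural overhead is one block reference per k elements rather
    than one pointer per element, at the cost of a small constant
    factor on traversal."""

    def __init__(self, k=4):
        self.k = k
        self.chunks = [[]]

    def append(self, x):
        if len(self.chunks[-1]) == self.k:
            self.chunks.append([])  # open a fresh block
        self.chunks[-1].append(x)

    def __iter__(self):
        for chunk in self.chunks:
            yield from chunk
```

Growing k trades slower updates for overhead approaching zero words per element, which is the diet the title refers to.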
In-Place 2-d Nearest Neighbor Search, 2007
Cited by 4 (1 self)
Abstract. We revisit a classic problem in computational geometry: preprocessing a planar n-point set to answer nearest neighbor queries. In SoCG 2004, Brönnimann, Chan, and Chen showed that it is possible to design an efficient data structure that takes no extra space at all other than the input array holding a permutation of the points. The best query time known for such "in-place data structures" is O(log² n). In this paper, we break the O(log² n) barrier by providing a method that answers nearest neighbor queries in time O((log n) log^{3/2}_2 log log n) = O(log ...
A Cache-Oblivious Implicit Dictionary with the Working Set Property
Cited by 2 (2 self)
Abstract. In this paper we present an implicit dictionary with the working set property, i.e. a dictionary supporting insert(e), delete(x) and predecessor(x) in O(log n) time and search(x) in O(log ℓ) time, where n is the number of elements stored in the dictionary and ℓ is the number of distinct elements searched for since the element with key x was last searched for. The dictionary stores the elements in an array of size n using no additional space. In the cache-oblivious model the operations insert(e), delete(x) and predecessor(x) cause O(log_B n) cache misses and search(x) causes O(log_B ℓ) cache misses.
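The working-set property itself is easy to demonstrate with a deliberately weak structure: a move-to-front list answers search(x) in time proportional to the number of distinct keys touched since x was last accessed, i.e. O(ℓ) rather than the paper's O(log ℓ), and it is neither implicit nor cache-oblivious. A toy sketch (names are illustrative):

```python
from collections import OrderedDict

class MoveToFrontDict:
    # Recency-ordered dictionary: the cost returned by search equals
    # the number of distinct keys accessed since the queried key was
    # last touched -- an O(l) working-set bound, far weaker than the
    # paper's O(log l), but the same distribution sensitivity.
    def __init__(self):
        self.items = OrderedDict()

    def insert(self, key, value):
        self.items[key] = value
        self.items.move_to_end(key, last=False)  # newest at the front

    def search(self, key):
        for cost, k in enumerate(list(self.items), start=1):
            if k == key:
                self.items.move_to_end(key, last=False)
                return self.items[key], cost
        return None, len(self.items)
```

The hard part the paper solves is getting this recency-sensitive bound while storing nothing but a permutation of the elements in an array.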
A Distribution-Sensitive Dictionary with Low Space Overhead
Cited by 2 (0 self)
Abstract. The time required for a sequence of operations on a data structure is usually measured in terms of the worst possible such sequence. This, however, is often an overestimate of the actual time required. Distribution-sensitive data structures attempt to take advantage of underlying patterns in a sequence of operations in order to reduce time complexity, since access patterns are non-random in many applications. Unfortunately, many of the distribution-sensitive structures in the literature require a great deal of space overhead in the form of pointers. We present a dictionary data structure that makes use of both randomization and existing space-efficient data structures to yield very low space overhead while maintaining distribution sensitivity in the expected sense.
Some Experimental Results on Data Structure Compression
Data structures are a vital concept in computer science, and in the last decades they have been investigated from many points of view. However, one big problem that is too often overlooked is how inefficient their space usage can be. Nowadays, due to the exponential proliferation of information, especially electronic data in the Internet environment and genome sequences in biology, there is a surge of interest in space-conscious algorithms, which requires us to consider space efficiency in designing data structures more seriously than ever before. A huge amount of theoretical work has been done on data structure compression; however, implementations of known compressed data structures have not been fully investigated. We believe that compressed data structures may become a crucial tool for the design of sophisticated and efficient software solutions, so it is important to develop the basic tools supporting them. In this thesis, we focused on the efficient implementation of succinct data structures, especially the RANK and SELECT functions. We did thorough experiments concerning the memory overhead, memory hierarchy behaviour and execution time of these implementations. Our experimental results showed ...
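A first step toward a practical RANK implementation is precomputing prefix popcounts per block, so that rank₁(i) combines one table lookup with a short in-block scan; real succinct implementations additionally shrink the auxiliary tables to o(n) bits and use machine popcount instructions. A small sketch (the block size is an arbitrary demo constant and the class name is illustrative):

```python
class RankBitVector:
    BLOCK = 8  # arbitrary demo block size; real structures tune this

    def __init__(self, bits):
        self.bits = bits
        # prefix[j] = number of 1 bits strictly before block j
        self.prefix = [0]
        for i in range(0, len(bits), self.BLOCK):
            self.prefix.append(self.prefix[-1] + sum(bits[i:i + self.BLOCK]))

    def rank1(self, i):
        # Number of 1 bits in bits[0:i]: one block-table lookup plus a
        # scan of at most BLOCK bits.
        b = i // self.BLOCK
        return self.prefix[b] + sum(self.bits[b * self.BLOCK:i])
```

SELECT (position of the j-th 1 bit) is typically built on top of the same tables by binary searching the prefix array, which is the kind of trade-off the experiments in such a thesis measure.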
Succinct and Implicit Data Structures for Computational Geometry
Abstract. Many classic data structures have been proposed to support geometric queries, such as range search, point location and nearest neighbor search. For a two-dimensional geometric data set consisting of n elements, these structures typically require O(n), close to O(n), or O(n lg n) words of space; while they support efficient queries, their storage costs are often much larger than the space required to encode the given data. As modern applications often process very large geometric data sets, it is often not practical to construct and store these data structures. This article surveys research that addresses this issue by designing space-efficient geometric data structures. In particular, two different but closely related lines of research are considered: succinct geometric data structures and implicit geometric data structures. The space usage of a succinct geometric data structure is equal to the information-theoretic minimum space required to encode the given geometric data set plus a lower-order term, and these structures also answer queries efficiently. Implicit geometric data structures are encoded as permutations of the elements in the data sets, and only zero or O(1) words of extra space are required to support queries. The succinct and implicit data structures surveyed in this article support several fundamental geometric queries and their variants.