Results 1–10 of 17
Searching Constant Width Mazes Captures the AC^0 Hierarchy
In Proceedings of the 15th Annual Symposium on Theoretical Aspects of Computer Science, 1997
Cited by 24 (5 self)
Abstract: We show that searching a width k maze is complete for Π_k, i.e., for the k'th level of the AC^0 hierarchy. Equivalently, st-connectivity for width k grid graphs is complete for Π_k. As an application, we show that there is a data structure solving dynamic st-connectivity for constant width grid graphs with time bound O(log log n) per operation on a random access machine. The dynamic algorithm is derived from the parallel one in an indirect way using algebraic tools.
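For concreteness, the st-connectivity problem the abstract classifies can be checked by a plain sequential BFS. This is a sketch of the problem only, not of the paper's contribution (the completeness result and the algebraically derived dynamic algorithm); vertex names as (column, row) pairs are an illustrative convention:

```python
from collections import deque

def st_connected(edges, s, t):
    """Decide st-connectivity by breadth-first search.
    `edges` is an iterable of undirected edges between vertices,
    e.g. (column, row) pairs of a constant width grid graph."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            return True
        for w in adj.get(u, []):
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return False
```

In a width-k grid graph the frontier of this search never holds more than k vertices per column, which is the structural fact the paper's constant-width analysis exploits.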
Relational Reasoning about Contexts
In Higher Order Operational Techniques in Semantics, Publications of the Newton Institute, 1998
Dynamic Representations of Sparse Graphs
In Proc. 6th International Workshop on Algorithms and Data Structures (WADS), 1999
Cited by 9 (0 self)
Abstract: We present a linear space data structure for maintaining graphs of bounded arboricity (a large class of sparse graphs containing, e.g., planar graphs and graphs of bounded treewidth) under edge insertions, edge deletions, and adjacency queries. The data structure supports adjacency queries in worst case O(c) time, and edge insertions and edge deletions in amortized O(1) and O(c + log n) time, respectively, where n is the number of nodes in the graph and c is the bound on the arboricity.
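The core idea can be sketched as an edge orientation: every edge lives in the out-list of exactly one endpoint, so an adjacency query inspects only two short lists. The sketch below uses a greedy orientation heuristic for illustration; the paper's re-orientation rules, which provably keep out-degrees O(c) under updates, are omitted:

```python
class SparseGraph:
    """Out-degree-orientation sketch for adjacency queries in
    sparse graphs. Each undirected edge {u, v} is stored in the
    out-set of exactly one endpoint."""

    def __init__(self):
        self.out = {}  # node -> set of out-neighbours

    def insert(self, u, v):
        self.out.setdefault(u, set())
        self.out.setdefault(v, set())
        # Greedy heuristic: orient toward the endpoint with the
        # smaller out-degree (the paper maintains a stronger invariant).
        if len(self.out[u]) <= len(self.out[v]):
            self.out[u].add(v)
        else:
            self.out[v].add(u)

    def delete(self, u, v):
        self.out.get(u, set()).discard(v)
        self.out.get(v, set()).discard(u)

    def adjacent(self, u, v):
        # Two probes; with out-degree O(c) each probed set is small.
        return v in self.out.get(u, set()) or u in self.out.get(v, set())
```

With out-degrees bounded by O(c), replacing the sets by short arrays gives the worst case O(c) query time stated in the abstract.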
Faster Deterministic Dictionaries
In 11th Annual ACM Symposium on Discrete Algorithms (SODA), 1999
Cited by 9 (5 self)
Abstract: We consider static dictionaries over the universe U = {0, 1}^w on a unit-cost RAM with word size w. Construction of a static dictionary with linear space consumption and constant lookup time can be done in linear expected time by a randomized algorithm. In contrast, the best previous deterministic algorithm for constructing such a dictionary with n elements runs in time O(n^{1+ε}) for ε > 0. This paper narrows the gap between deterministic and randomized algorithms exponentially, from a factor of n^ε to an O(log n) factor. The algorithm is weakly nonuniform, i.e. requires certain precomputed constants dependent on w. A byproduct of the result is a lookup time vs. insertion time trade-off for dynamic dictionaries, which is optimal for a certain class of deterministic hashing schemes.
Hash and displace: Efficient evaluation of minimal perfect hash functions
In Workshop on Algorithms and Data Structures, 1999
Cited by 9 (1 self)
Abstract: A new way of constructing (minimal) perfect hash functions is described. The technique considerably reduces the overhead associated with resolving buckets in two-level hashing schemes. Evaluating a hash function requires just one multiplication and a few additions apart from primitive bit operations. The number of accesses to memory is two, one of which is to a fixed location. This improves the probe performance of previous minimal perfect hashing schemes, and is shown to be optimal. The hash function description ("program") for a set of size n occupies O(n) words, and can be constructed in expected O(n) time.
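The hash-and-displace evaluation shape is h(x) = (f(x) + d[g(x)]) mod n: a bucket hash g selects a precomputed displacement, which is added to a second hash f. The toy construction below illustrates this shape with ad-hoc multiplicative hash functions and a greedy displacement search; it is not the paper's construction and does not achieve its bounds:

```python
import random

def build_hash_and_displace(keys, p=2**31 - 1):
    """Build a toy minimal perfect hash for integer `keys` in the
    hash-and-displace style: keys are split into buckets by g(x),
    then each bucket gets one displacement d[g(x)] so that
    (f(x) + d[g(x)]) mod n is a distinct slot for every key."""
    n = len(keys)
    r = n  # number of buckets (the paper tunes this parameter)
    while True:  # retry with fresh hash functions until placement works
        a1, a2 = random.randrange(1, p), random.randrange(1, p)
        g = lambda x: (a1 * x % p) % r
        f = lambda x: (a2 * x % p) % n
        buckets = {}
        for x in keys:
            buckets.setdefault(g(x), []).append(x)
        d, used, ok = [0] * r, [False] * n, True
        # Place larger buckets first, as in two-level schemes.
        for b, bkeys in sorted(buckets.items(), key=lambda kv: -len(kv[1])):
            for disp in range(n):
                slots = [(f(x) + disp) % n for x in bkeys]
                if len(set(slots)) == len(slots) and not any(used[s] for s in slots):
                    d[b] = disp
                    for s in slots:
                        used[s] = True
                    break
            else:
                ok = False
                break
        if ok:
            return lambda x: (f(x) + d[g(x)]) % n
```

Evaluation touches memory twice, matching the abstract's probe count: once for the displacement d[g(x)] and once for the final slot.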
History-Independent Cuckoo Hashing
Cited by 8 (4 self)
Abstract: Cuckoo hashing is an efficient and practical dynamic dictionary. It provides expected amortized constant update time, worst case constant lookup time, and good memory utilization. Various experiments demonstrated that cuckoo hashing is highly suitable for modern computer architectures and distributed settings, and offers significant improvements compared to other schemes. In this work we construct a practical history-independent dynamic dictionary based on cuckoo hashing. In a history-independent data structure, the memory representation at any point in time yields no information on the specific sequence of insertions and deletions that led to its current content, other than the content itself. Such a property is significant when preventing unintended leakage of information, and was also found useful in several algorithmic settings. Our construction enjoys most of the attractive properties of cuckoo hashing. In particular, no dynamic memory allocation is required, updates are performed in expected amortized constant time, and membership queries are performed in worst case constant time. Moreover, with high probability, the lookup procedure queries only two memory entries which are independent and can be queried in parallel. The approach underlying our construction is to enforce a canonical memory representation on cuckoo hashing. That is, up to the initial randomness, each set of elements has a unique memory representation.
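Plain cuckoo hashing, the starting point of this construction, can be sketched as follows. This shows only the standard scheme (two tables, each key in one of its two nominated cells, insertion by eviction), not the canonical-representation machinery the paper adds; the hash functions are illustrative:

```python
import random

class CuckooHash:
    """Minimal cuckoo hashing sketch: two tables and two hash
    functions; each stored key occupies one of its two cells."""

    def __init__(self, size=16):
        self.size = size
        self.tables = [[None] * size, [None] * size]
        self._pick_hashes()

    def _pick_hashes(self):
        self.seeds = [random.randrange(1, 2**31 - 1) for _ in range(2)]

    def _h(self, i, x):
        return (self.seeds[i] * hash(x)) % (2**31 - 1) % self.size

    def lookup(self, x):
        # Worst case constant time: at most two probes,
        # to independent cells that could be queried in parallel.
        return any(self.tables[i][self._h(i, x)] == x for i in range(2))

    def insert(self, x):
        if self.lookup(x):
            return
        for _ in range(32):  # bounded eviction chain
            for i in range(2):
                pos = self._h(i, x)
                self.tables[i][pos], x = x, self.tables[i][pos]
                if x is None:
                    return
        self._rehash(x)  # eviction cycle: rebuild with fresh hashes

    def _rehash(self, pending):
        items = [pending] + [y for t in self.tables for y in t if y is not None]
        self.tables = [[None] * self.size, [None] * self.size]
        self._pick_hashes()
        for y in items:
            self.insert(y)
```

In this plain form the final layout depends on insertion order (who evicted whom); enforcing a canonical placement for each stored set is exactly what makes the paper's variant history-independent.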
A Trade-Off for Worst-Case Efficient Dictionaries
Cited by 7 (2 self)
Abstract: We consider dynamic dictionaries over the universe U = {0, 1}^w on a unit-cost RAM with word size w and a standard instruction set, and present a linear space deterministic dictionary accommodating membership queries in time (log log n)^O(1) and updates in time (log n)^O(1), where n is the size of the set stored. Previous solutions either had query time (log n)^Ω(1) or update time 2^{ω(√log n)} in the worst case.
Streaming Computation of Combinatorial Objects
In Proceedings of the Seventeenth Annual IEEE Conference on Computational Complexity, 2002
Cited by 7 (2 self)
Abstract: We prove (mostly tight) space lower bounds for "streaming" (or "online") computations of four fundamental combinatorial objects: error-correcting codes, universal hash functions, extractors, and dispersers. Streaming computations for these objects are motivated algorithmically by massive data set applications and complexity-theoretically by pseudorandomness and derandomization for space-bounded probabilistic algorithms.