Cell probe complexity - a survey
In 19th Conference on the Foundations of Software Technology and Theoretical Computer Science (FSTTCS), Advances in Data Structures Workshop, 1999
Abstract

Cited by 33 (0 self)
The cell probe model is a general, combinatorial model of data structures. We give a survey of known results about the cell probe complexity of static and dynamic data structure problems, with an emphasis on techniques for proving lower bounds.
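In the cell probe model a query is charged only for the memory cells it reads; all computation is free. As a toy illustration (my own sketch, not code from the survey), the following Python wrapper counts the cell probes made by a binary-search membership query, whose cost in this model is O(log n) probes:

```python
class CellProbeArray:
    """Wraps an array of cells and counts how many are probed."""
    def __init__(self, cells):
        self.cells = list(cells)
        self.probes = 0

    def read(self, i):
        self.probes += 1          # every cell access costs one probe
        return self.cells[i]

def member(table, x):
    """Binary search for x over sorted cells: O(log n) probes per query."""
    lo, hi = 0, len(table.cells) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        v = table.read(mid)
        if v == x:
            return True
        if v < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

table = CellProbeArray([2, 3, 5, 7, 11, 13, 17, 19])
assert member(table, 11)
# For n cells, each query probes at most ceil(log2 n) + 1 cells.
```

Lower bounds in the survey are proved against exactly this cost measure: any data structure, however clever its free computation, must read some minimum number of cells per query.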
Static Dictionaries on AC^0 RAMs: Query time Θ(√(log n / log log n)) is necessary and sufficient
, 1996
Abstract

Cited by 21 (6 self)
In this paper we consider solutions to the static dictionary problem on AC^0 RAMs, i.e. random access machines where the only restriction on the finite instruction set is that all computational instructions are in AC^0. Our main result is a tight upper and lower bound of Θ(√(log n / log log n)) on the time for answering membership queries in a set of size n when reasonable space is used for the data structure storing the set; the upper bound can be obtained using space n^{O(1)}. Several variations of this result are also obtained. Among others, we show a tradeoff between time and circuit depth under the unit-cost assumption: any RAM instruction set which permits a linear space, constant query time solution to the static dictionary problem must have an instruction of depth Ω(log w / log log w), where w is the word size of the machine (and 2^w the size of the universe). This matches the depth of multiplication and integer division, used in the perfect hashing scheme by Fredman, Komlós and Szemerédi.
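The Fredman-Komlós-Szemerédi scheme mentioned above stores a static set in a two-level hash table so that a query reads O(1) table entries. A minimal Python sketch of the two-level idea (my own simplification: it uses Python's built-in `hash` with random salts in place of the universal hash families of the actual scheme):

```python
import random

class StaticDict:
    """Sketch of two-level (FKS-style) perfect hashing for a static set.
    Level 1 hashes the n keys into n buckets; a bucket holding b keys gets
    its own collision-free level-2 table of size b*b, found by retrying
    random salts. Queries read O(1) table entries."""

    def __init__(self, keys):
        keys = set(keys)
        self.n = max(1, len(keys))
        self.salt = random.randrange(1 << 30)
        buckets = [[] for _ in range(self.n)]
        for k in keys:
            buckets[self._h(self.salt, k, self.n)].append(k)
        self.tables = []
        for b in buckets:
            size = len(b) * len(b)
            while True:  # expected O(1) retries per bucket
                salt = random.randrange(1 << 30)
                table = [None] * size
                if self._fill(salt, b, table):
                    self.tables.append((salt, table))
                    break

    @staticmethod
    def _h(salt, key, m):
        return hash((salt, key)) % m

    @classmethod
    def _fill(cls, salt, keys, table):
        for k in keys:
            i = cls._h(salt, k, len(table))
            if table[i] is not None:
                return False      # collision: try another salt
            table[i] = k
        return True

    def __contains__(self, k):
        salt, table = self.tables[self._h(self.salt, k, self.n)]
        return bool(table) and table[self._h(salt, k, len(table))] == k
```

With true universal hashing the expected total size of the level-2 tables is O(n), which is what makes the scheme's linear space bound work; this sketch only illustrates the two-probe query structure.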
Adjacency queries in dynamic sparse graphs
 Inf. Process. Lett
, 2007
Abstract

Cited by 3 (0 self)
We deal with the problem of maintaining a dynamic graph so that queries of the form "is there an edge between u and v?" are processed fast. We consider graphs of bounded arboricity, i.e., graphs with no dense subgraphs, like for example planar graphs. Brodal and Fagerberg [WADS'99] described a very simple linear-size data structure which processes queries in constant worst-case time and performs insertions and deletions in O(1) and O(log n) amortized time, respectively. We show a complementary result that their data structure can be used to get O(log n) worst-case time for queries, O(1) amortized time for insertions and O(1) worst-case time for deletions. Moreover, our analysis shows that by combining the data structure of Brodal and Fagerberg with efficient dictionaries one gets an O(log log log n) worst-case time bound for queries and deletions and O(log log log n) amortized time for insertions, with the size of the data structure still linear. This last result holds even for graphs of arboricity bounded by O(log^k n), for some constant k.
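The Brodal-Fagerberg structure orients every edge so that each vertex keeps only a short out-list, and an adjacency query inspects just the two out-lists of its endpoints. A rough Python sketch of this idea (the constant `cap` and the naive re-orientation rule are my simplifications; termination of the rebalancing assumes `cap` is at least about twice the arboricity):

```python
from collections import defaultdict

class SparseAdjacency:
    """Sketch of the bounded-out-degree orientation idea for graphs of
    bounded arboricity: store each edge in exactly one endpoint's out-list
    and keep every out-list of size at most `cap`."""

    def __init__(self, cap=4):
        self.cap = cap                 # assumed ~ 2 * arboricity
        self.out = defaultdict(set)    # out[u] = heads of edges u -> v

    def insert(self, u, v):
        self.out[u].add(v)
        self._rebalance(u)

    def _rebalance(self, u):
        # If u's out-degree exceeds the cap, flip all of u's out-edges;
        # an amortized argument bounds the total number of flips when the
        # graph's arboricity stays below cap / 2.
        while len(self.out[u]) > self.cap:
            heads = list(self.out[u])
            self.out[u].clear()
            for v in heads:
                self.out[v].add(u)
            # a flipped edge may push a neighbour over the cap in turn
            u = max(heads, key=lambda w: len(self.out[w]))

    def delete(self, u, v):
        self.out[u].discard(v)
        self.out[v].discard(u)         # edge lives in exactly one list

    def adjacent(self, u, v):
        # Query cost: scan two lists of size at most cap, i.e. O(1).
        return v in self.out[u] or u in self.out[v]
```

The paper's O(log log log n) bounds come from replacing the plain out-lists above with efficient dictionaries, not from changing this orientation scheme.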
Dynamic Dictionaries in Constant Worst-Case Time
Abstract
We introduce a technique to maintain a set of n elements from a universe of size u with membership and indel operations, so that elements are associated with r-bit satellite data. We achieve constant worst-case time for all the operations, at the price of spending u + o(u) + O(nr + n log log log u) bits of space. Only the variant where the space is of the form O(nr + n log u) was exhaustively explored before, yet in that case existing lower bounds prevent achieving constant worst-case times. As a byproduct, we improve a folklore data structure for initializing an array of n elements in constant time, by reducing its space requirement from 2n log n to n + o(n) bits. Key words: Algorithms and data structures, succinct data structures, dynamic perfect hashing, dynamic dictionaries with satellite information.
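The folklore constant-time array initialization mentioned above keeps, besides the data array, a stack of written positions and a back-pointer array; a slot counts as initialized only if its back-pointer lands below the stack top and the stack entry points back at it. A Python simulation of the 2n log n-bit variant that the paper improves on (Python lists are initialized anyway, so this only models the bookkeeping):

```python
class InitArray:
    """Folklore array with O(1) initialization and reset.
    Slot i is valid iff when[i] < top and stack[when[i]] == i."""

    def __init__(self, n, default=0):
        self.default = default
        self.data = [None] * n     # conceptually uninitialized memory
        self.when = [0] * n        # back-pointer into the stack
        self.stack = [0] * n       # positions written so far
        self.top = 0               # number of initialized slots

    def _initialized(self, i):
        t = self.when[i]
        return t < self.top and self.stack[t] == i

    def read(self, i):
        return self.data[i] if self._initialized(i) else self.default

    def write(self, i, value):
        if not self._initialized(i):
            self.when[i] = self.top
            self.stack[self.top] = i
            self.top += 1
        self.data[i] = value

    def reset(self):
        self.top = 0   # O(1): every slot reads as default again
```

The two auxiliary arrays `when` and `stack` are what cost 2n log n bits here; the paper's contribution is bringing that overhead down to n + o(n) bits.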