Cuckoo hashing
Journal of Algorithms, 2001
"... We present a simple dictionary with worst case constant lookup time, equaling the theoretical performance of the classic dynamic perfect hashing scheme of Dietzfelbinger et al. (Dynamic perfect hashing: Upper and lower bounds. SIAM J. Comput., 23(4):738–761, 1994). The space usage is similar to that ..."
Abstract

Cited by 170 (6 self)
We present a simple dictionary with worst case constant lookup time, equaling the theoretical performance of the classic dynamic perfect hashing scheme of Dietzfelbinger et al. (Dynamic perfect hashing: Upper and lower bounds. SIAM J. Comput., 23(4):738–761, 1994). The space usage is similar to that of binary search trees, i.e., three words per key on average. Besides being conceptually much simpler than previous dynamic dictionaries with worst case constant lookup time, our data structure is interesting in that it does not use perfect hashing, but rather a variant of open addressing where keys can be moved back in their probe sequences. An implementation inspired by our algorithm, but using weaker hash functions, is found to be quite practical. It is competitive with the best known dictionaries having an average case (but no nontrivial worst case) guarantee.
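The open-addressing variant described above, in which an inserted key may evict the current occupant and push it back to its alternate position, can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's exact construction: the class name and table size are invented, and Python's built-in `hash()` stands in for the stronger (universal) hash families the analysis assumes.

```python
# Minimal cuckoo hash table sketch (illustrative toy, not the paper's
# exact construction): two tables and two hash functions; an inserted key
# may evict the current occupant, which is then re-placed via its
# alternate slot.
class CuckooHash:
    def __init__(self, size=101, max_kicks=50):
        self.size = size
        self.max_kicks = max_kicks
        self.tables = [[None] * size, [None] * size]

    def _h(self, i, key):
        # Stand-in hash functions built on Python's hash(); the paper
        # assumes stronger (universal) hash families here.
        return hash((i, key)) % self.size

    def lookup(self, key):
        # Worst-case constant time: at most two probes, one per table.
        return (self.tables[0][self._h(0, key)] == key or
                self.tables[1][self._h(1, key)] == key)

    def insert(self, key):
        if self.lookup(key):
            return True
        for _ in range(self.max_kicks):
            for i in (0, 1):
                pos = self._h(i, key)
                if self.tables[i][pos] is None:
                    self.tables[i][pos] = key
                    return True
            # Both candidate slots are full: evict the occupant of the
            # table-0 slot and try to re-place the evicted key instead.
            pos = self._h(0, key)
            key, self.tables[0][pos] = self.tables[0][pos], key
        return False  # a full implementation would rehash here
```

A lookup probes only two fixed positions, which is the source of the worst-case constant lookup time claimed in the abstract; all the work of resolving collisions is shifted to insertion.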
Exact Analyses of a Simple Heuristic Employed in Array Compression
2002
"... ... this paper is to precisely analyse the behaviour of one such extremely simple heuristic which is known to give modest compression in practice. For the heuristic we prove that the expected asymptotic space requirement is, at worst, a(k)n+ b(k)x and that although its dependency on n is inherent ..."
Abstract
... this paper is to precisely analyse the behaviour of one such extremely simple heuristic which is known to give modest compression in practice. For the heuristic we prove that the expected asymptotic space requirement is, at worst, a(k)n + b(k)x and that although its dependency on n is inherent, it can be made arbitrarily small. Here k is a parameter and a(k) and b(k) are, respectively, monotonically decreasing and increasing functions. Thus k allows a tradeoff between dependency on n and x; for example, pairs (a(k), b(k)) can be (0.1, 3.26), (0.03, 5.57) and (6 × 10⁻⁴, 33). We also show that for some applications the dependency of the space requirement on n can be made sublinear. The heuristic allows constant time access to any element. Our analyses are over two different models for the uniform probability distribution and we derive exact formulae for the expected space used. We prove that the heuristic gives the same asymptotic performance in both models.
A reliable randomized algorithm for the . . .
1997
"... The following two computational problems are studied: Duplicate grouping: Assume that n items are given, each of which is labeled by an integer key from the set 0,..., U � 1 4. Store the items in an array of size n such that items with the same key occupy a contiguous segment of the array. Closest p ..."
Abstract
The following two computational problems are studied: Duplicate grouping: Assume that n items are given, each of which is labeled by an integer key from the set {0, ..., U − 1}. Store the items in an array of size n such that items with the same key occupy a contiguous segment of the array. Closest pair: Assume that a multiset of n points in the d-dimensional Euclidean space is given, where d ≥ 1 is a fixed integer. Each point is represented as a d-tuple of integers in the range {0, ..., U − 1} (or of arbitrary real numbers). Find a closest pair, i.e., a pair of points whose distance is minimal over all such pairs.
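The duplicate-grouping task stated above can be illustrated with a short sketch. This is a toy only: the function name and signature are invented, and Python's insertion-ordered dict stands in for the reliable randomized hashing the paper actually develops.

```python
# Toy solution to the duplicate-grouping problem: place the n items in an
# output array so that items sharing a key occupy a contiguous segment.
# Python's dict (insertion-ordered) stands in for the paper's randomized
# hashing machinery.
def duplicate_grouping(items, key=lambda x: x):
    groups = {}
    for it in items:
        groups.setdefault(key(it), []).append(it)
    out = []
    for group in groups.values():
        out.extend(group)  # equal-keyed items land contiguously
    return out
```

For example, `duplicate_grouping([3, 1, 3, 2, 1, 3])` returns `[3, 3, 3, 1, 1, 2]`: the output need not be sorted, only grouped, which is exactly the weaker requirement that lets the problem be solved faster than sorting.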