Tight thresholds for cuckoo hashing via XORSAT (2010)
Cited by 18 (1 self)

Abstract:
We settle the question of tight thresholds for offline cuckoo hashing. The problem can be stated as follows: we have n keys to be hashed into m buckets, each capable of holding a single key. Each key has k ≥ 3 (distinct) associated buckets chosen uniformly at random and independently of the choices of other keys. A hash table can be constructed successfully if each key can be placed into one of its buckets. We seek thresholds c_k such that, as n goes to infinity, if n/m ≤ c for some c < c_k then a hash table can be constructed successfully with high probability, and if n/m ≥ c for some c > c_k then a hash table cannot be constructed successfully with high probability. Here we consider the offline version of the problem, where all keys and hash values are given, so the problem is equivalent to previous models of multiple-choice hashing. We find the thresholds for all values of k > 2 by showing that they are in fact the same as the previously known thresholds for the random k-XORSAT problem. We then extend these results to the setting where keys can have differing numbers of choices, and provide evidence, in the form of an algorithm, for a conjecture extending this result to cuckoo hash tables that store multiple keys in a bucket.
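As a concrete illustration of the problem this abstract describes (not the paper's proof technique), offline feasibility is exactly a bipartite perfect-matching question between keys and buckets. A minimal sketch, with all parameter values chosen for illustration:

```python
import random
import sys

sys.setrecursionlimit(10000)  # augmenting paths may recurse deeply

def feasible(choices, m):
    """Offline cuckoo hashing feasibility: can every key be placed in
    one of its chosen buckets, one key per bucket? This is a bipartite
    perfect-matching test (Kuhn's augmenting-path algorithm)."""
    owner = [None] * m  # owner[b] = key currently occupying bucket b

    def place(key, seen):
        for b in choices[key]:
            if b not in seen:
                seen.add(b)
                # take an empty bucket, or evict and re-place its owner
                if owner[b] is None or place(owner[b], seen):
                    owner[b] = key
                    return True
        return False

    return all(place(key, set()) for key in range(len(choices)))

# n keys, m buckets, k = 3 choices per key; the load n/m = 0.8 is below
# the k = 3 threshold c_3 ≈ 0.918, so construction succeeds whp.
random.seed(0)
n, m, k = 800, 1000, 3
choices = [random.sample(range(m), k) for _ in range(n)]
ok = feasible(choices, m)
print(ok)
```

Above the threshold the same check fails with high probability; the paper pins down exactly where the transition happens.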
Succinct Data Structures for Retrieval and Approximate Membership
Cited by 13 (6 self)

Abstract:
The retrieval problem is the problem of associating data with keys in a set. Formally, the data structure must store a function f: U → {0, 1}^r that has specified values on the elements of a given set S ⊆ U, |S| = n, but may have any value on elements outside S. All known methods (e.g. those based on perfect hash functions) induce a space overhead of Θ(n) bits over the optimum, regardless of the evaluation time. We show that for any k, query time O(k) can be achieved using space that is within a factor 1 + e^{-k} of optimal, asymptotically for large n. The time to construct the data structure is O(n), expected. If we allow logarithmic evaluation time, the additive overhead can be reduced to O(log log n) bits with high probability. A general reduction transfers the results on retrieval into analogous results on approximate membership, a problem traditionally addressed using Bloom filters. Thus we obtain space bounds arbitrarily close to the lower bound for this problem as well. The evaluation procedures of our data structures are extremely simple. For the results stated above we assume free access to fully random hash functions; this assumption can be justified using space o(n) to simulate full randomness on a RAM.
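One standard way to realize such a retrieval structure (a sketch of the general XOR/linear-algebra approach, not necessarily this paper's exact construction; every name and parameter here is illustrative) stores a table of r-bit cells, defines f(x) as the XOR of the k cells x hashes to, and solves a linear system over GF(2) at construction time:

```python
import random

def build_retrieval(keys, values, m, k=3, seed=1):
    """Try to solve, over GF(2), for a table of m r-bit cells such that
    the XOR of the k cells a key hashes to equals its value.  Returns
    (table, positions) or None if this seed's system is inconsistent."""
    rnd = random.Random(seed)
    pos = {x: rnd.sample(range(m), k) for x in keys}  # stand-in for k hash functions
    pivots = {}  # pivot cell index -> (row mask, row rhs)
    for x in keys:
        mask, rhs = 0, values[x]
        for p in pos[x]:
            mask |= 1 << p
        while mask:
            j = mask.bit_length() - 1          # highest cell index in the row
            if j not in pivots:
                pivots[j] = (mask, rhs)
                break
            pmask, prhs = pivots[j]            # eliminate with existing pivot
            mask ^= pmask
            rhs ^= prhs
        else:
            if rhs:                            # row reduced to 0 = rhs, rhs != 0
                return None
    table = [0] * m                            # free cells stay 0
    for j in sorted(pivots):                   # back-substitute, low to high
        mask, rhs = pivots[j]
        rest = mask & ~(1 << j)
        while rest:
            low = rest & -rest
            rhs ^= table[low.bit_length() - 1]
            rest ^= low
        table[j] = rhs
    return table, pos

def query(table, pos, x):
    val = 0
    for p in pos[x]:
        val ^= table[p]
    return val

# n = 80 keys, m = 100 cells: load 0.8 is below the k = 3 solvability
# threshold, so at most a few seeds are needed.
keys = list(range(80))
values = {x: (x * 2654435761) % 256 for x in keys}  # arbitrary 8-bit data
built, seed = None, 0
while built is None and seed < 20:
    seed += 1
    built = build_retrieval(keys, values, m=100, seed=seed)
table, pos = built
ok = all(query(table, pos, x) == values[x] for x in keys)
print(ok)
```

Note that `pos` is memoized only for the sketch; a real structure recomputes positions from hash functions at query time, so only the table itself is stored.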
The Random Graph Threshold for k-Orientability and a Fast Algorithm for Optimal Multiple-Choice Allocation (2007)
Cited by 12 (2 self)

Abstract:
We investigate a linear-time greedy algorithm for the following load balancing problem: assign m balls to n bins such that the maximum occupancy is minimized, where each ball can be placed into one of two randomly chosen bins. This problem is closely related to the problem of orienting the edges of an undirected graph to obtain a directed graph with minimum in-degree. Using differential equation methods, we derive thresholds for the solution quality achieved by our algorithm. Since these thresholds coincide with lower bounds for the achievable solution quality, this proves the optimality of our algorithm (as n → ∞, in a probabilistic sense) and establishes the thresholds for k-orientability of random graphs. This proves an assertion of Karp and Saks.
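The classical two-choice greedy baseline this abstract builds on can be simulated in a few lines (a sketch only; the paper's optimal algorithm is more involved):

```python
import random

def two_choice_max_load(m_balls, n_bins, seed=0):
    """Greedy two-choice allocation: each ball goes to the less loaded
    of two uniformly random bins; returns the maximum occupancy."""
    rnd = random.Random(seed)
    load = [0] * n_bins
    for _ in range(m_balls):
        a, b = rnd.randrange(n_bins), rnd.randrange(n_bins)
        load[a if load[a] <= load[b] else b] += 1
    return max(load)

# With m = n, two choices keep the maximum load at ln ln n / ln 2 + O(1),
# versus ~ ln n / ln ln n for a single random choice.
ml = two_choice_max_load(10**5, 10**5)
print(ml)
```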
Backyard Cuckoo Hashing: Constant Worst-Case Operations with a Succinct Representation (2010)
Cited by 7 (3 self)

Abstract:
The performance of a dynamic dictionary is measured mainly by its update time, lookup time, and space consumption. In terms of update time and lookup time there are known constructions that guarantee constant-time operations in the worst case with high probability, and in terms of space consumption there are known constructions that use essentially optimal space. In this paper we settle two fundamental open problems:
• We construct the first dynamic dictionary that enjoys the best of both worlds: we present a two-level variant of cuckoo hashing that stores n elements using (1 + ϵ)n memory words and guarantees constant-time operations in the worst case with high probability. Specifically, for any ϵ = Ω((log log n / log n)^{1/2}) and for any sequence of polynomially many operations, with high probability over the randomness of the initialization phase, all operations are performed in constant time which is independent of ϵ. The construction is based on augmenting cuckoo hashing with a “backyard” that handles a large fraction of the elements, together with a de-amortized perfect hashing scheme for eliminating the dependency on ϵ.
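The two-level idea, a fast front table backed by a small "backyard" for overflow, can be sketched as follows (illustrative only; the hash functions and parameters are placeholders, and the paper's de-amortized, succinct construction is substantially more involved):

```python
import random

class CuckooWithBackyard:
    """Two-level dictionary sketch: a 2-choice cuckoo front table plus a
    'backyard' overflow store that absorbs keys whose insertion exceeds
    a fixed kick budget."""

    def __init__(self, m, max_kicks=32, seed=0):
        self.m = m
        self.slots = [None] * m               # front table, one key per slot
        self.backyard = {}                    # second level: overflow keys
        self.max_kicks = max_kicks
        self.rnd = random.Random(seed)
        self.salts = (self.rnd.random(), self.rnd.random())

    def _pos(self, key, i):
        return hash((key, self.salts[i])) % self.m

    def insert(self, key):
        for _ in range(self.max_kicks):       # random-walk cuckoo insertion
            p = self._pos(key, self.rnd.randrange(2))
            if self.slots[p] is None:
                self.slots[p] = key
                return
            self.slots[p], key = key, self.slots[p]   # evict, re-place victim
        self.backyard[key] = True             # bounded work: spill over

    def contains(self, key):
        return (self.slots[self._pos(key, 0)] == key
                or self.slots[self._pos(key, 1)] == key
                or key in self.backyard)

# Load 0.4 is safely below the 2-choice cuckoo threshold of 0.5, so the
# backyard stays small.
t = CuckooWithBackyard(m=1000, seed=1)
for x in range(400):
    t.insert(x)
ok = all(t.contains(x) for x in range(400))
print(ok, len(t.backyard))
```

Capping the kick budget is what bounds worst-case insertion work; the price is the second lookup level, which the paper makes both constant-time and space-efficient.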
Load Balancing and Orientability Thresholds for Random Hypergraphs
Cited by 5 (1 self)

Abstract:
Let h > w > 0 be two fixed integers. Let H be a random hypergraph whose hyperedges are uniformly of size h. To w-orient a hyperedge, we assign exactly w of its vertices positive signs with respect to the hyperedge, and the rest negative. A (w, k)-orientation of H consists of a w-orientation of all hyperedges of H, such that each vertex receives at most k positive signs from its incident hyperedges. When k is large enough, we determine the threshold for the existence of a (w, k)-orientation of a random hypergraph. The (w, k)-orientation of hypergraphs is strongly related to a general version of the offline load balancing problem. The graph case, when h = 2 and w = 1, was solved recently by Cain, Sanders and Wormald and independently by Fernholz and Ramachandran, thereby settling a conjecture made by Karp and Saks. Motivated by a problem of cuckoo hashing, the special hypergraph case with w = k = 1 was solved in three separate preprints dating from October 2009, by Frieze and Melsted, by Fountoulakis and Panagiotou, and by Dietzfelbinger et al.
Some Open Questions Related to Cuckoo Hashing
Cited by 2 (1 self)

Abstract:
The purpose of this brief note is to describe recent work in the area of cuckoo hashing, including a clear description of several open problems, with the hope of spurring further research.
Orientability thresholds for random hypergraphs
Cited by 1 (1 self)

Abstract:
Let h > w > 0 be two fixed integers. Let H be a random hypergraph whose hyperedges are all of cardinality h. To w-orient a hyperedge, we assign exactly w of its vertices positive signs with respect to the hyperedge, and the rest negative. A (w, k)-orientation of H consists of a w-orientation of all hyperedges of H, such that each vertex receives at most k positive signs from its incident hyperedges. When k is large enough, we determine the threshold for the existence of a (w, k)-orientation of a random hypergraph. The (w, k)-orientation of hypergraphs is strongly related to a general version of the offline load balancing problem. The graph case, when h = 2 and w = 1, was solved recently by Cain, Sanders and Wormald and independently by Fernholz and Ramachandran, which settled a conjecture of Karp and Saks.
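For a given (non-random) hypergraph, deciding (w, k)-orientability reduces to a max-flow computation: source → hyperedge arcs of capacity w, hyperedge → incident-vertex arcs of capacity 1, vertex → sink arcs of capacity k. A self-contained sketch of this reduction (illustrative, not taken from the paper):

```python
from collections import deque

def wk_orientable(edges, n, w, k):
    """Decide (w, k)-orientability of a hypergraph on n vertices via
    max flow.  Orientable iff the max flow saturates every source arc,
    i.e. equals w * len(edges)."""
    E = len(edges)
    S, T = 0, E + n + 1                       # hyperedges: 1..E, vertices: E+1..E+n
    cap = {}

    def add(u, v, c):
        cap[(u, v)] = cap.get((u, v), 0) + c
        cap.setdefault((v, u), 0)             # residual arc

    for i, e in enumerate(edges):
        add(S, 1 + i, w)
        for v in e:
            add(1 + i, 1 + E + v, 1)
    for v in range(n):
        add(1 + E + v, T, k)
    adj = {}
    for (u, v) in cap:
        adj.setdefault(u, []).append(v)

    flow = 0
    while True:                               # Edmonds-Karp: BFS augmenting paths
        prev = {S: None}
        q = deque([S])
        while q and T not in prev:
            u = q.popleft()
            for v in adj.get(u, []):
                if v not in prev and cap[(u, v)] > 0:
                    prev[v] = u
                    q.append(v)
        if T not in prev:
            return flow == w * E
        bottleneck, v = float("inf"), T
        while prev[v] is not None:
            bottleneck = min(bottleneck, cap[(prev[v], v)])
            v = prev[v]
        v = T
        while prev[v] is not None:
            cap[(prev[v], v)] -= bottleneck
            cap[(v, prev[v])] += bottleneck
            v = prev[v]
        flow += bottleneck

# w = k = 1 on 3-uniform hyperedges is exactly offline cuckoo hashing:
print(wk_orientable([[0, 1, 2], [1, 2, 3], [0, 2, 3]], 4, 1, 1))  # feasible
print(wk_orientable([[0, 1, 2]] * 5, 3, 1, 1))                    # 5 keys, 3 buckets
```

An integral max flow assigns each saturated hyperedge w distinct positive vertices while no vertex exceeds k positives, which is exactly a (w, k)-orientation.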
Sparse Random Graphs: Methods, Structure, and Heuristics (2007)
Cited by 1 (1 self)

Abstract:
This dissertation is an algorithmic study of sparse random graphs which are parametrized by the distribution of vertex degrees. Our contributions include: a formula for the diameter of various sparse random graphs, including the Erdős–Rényi random graphs Gn,m and Gn,p and certain power-law graphs; a heuristic for the k-orientability problem, which performs optimally for certain classes of random graphs, again including the Erdős–Rényi models Gn,m and Gn,p; and an improved lower bound for the independence ratio of random 3-regular graphs. In addition to these structural results, we also develop a technique for reasoning abstractly about random graphs by representing discrete structures topologically.
Coloring the edges of a random graph without a monochromatic giant component
Electronic Notes in Discrete Mathematics, 34:615–619, 2009. European Conference on Combinatorics, Graph Theory and Applications (EuroComb 2009). Electronic Journal of Combinatorics 17 (2010), #R133.
Cited by 1 (0 self)

Abstract:
We study the following two problems: (i) given a random graph Gn,m (a graph drawn uniformly at random from all graphs on n vertices with exactly m edges), can we color its edges with r colors such that no color class contains a component of size Θ(n)? (ii) given a random graph Gn,m with a random partition of its edge set into sets of size r, can we color its edges with r colors, subject to the restriction that every color is used for exactly one edge in every set of the partition, such that no color class contains a component of size Θ(n)? We prove that for any fixed r ≥ 2, in both settings the (sharp) threshold for the existence of such a coloring coincides with the known threshold for r-orientability of Gn,m, which is at m = r·c*_r·n for some analytically computable constant c*_r. The fact that the two problems have the same threshold is in contrast with known results for the two corresponding Achlioptas-type problems.
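The naive baseline for the first problem, coloring each edge uniformly at random and measuring the largest monochromatic component with union-find, is easy to simulate (a sketch; uniform coloring only avoids giants up to m ≈ rn/2, well below the r-orientability threshold that the cleverer colorings in the paper attain):

```python
import random

class DSU:
    """Union-find with path halving and union by size."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

def largest_mono_component(n, m, r, seed=3):
    """Color m random edges (multigraph approximation of Gn,m) uniformly
    with r colors; return the largest component within any color class."""
    rnd = random.Random(seed)
    dsus = [DSU(n) for _ in range(r)]
    for _ in range(m):
        u, v = rnd.randrange(n), rnd.randrange(n)
        dsus[rnd.randrange(r)].union(u, v)
    return max(max(d.size[d.find(x)] for x in range(n)) for d in dsus)

# Each color class is roughly Gn,m/r, so uniform coloring is subcritical
# exactly when m/r < n/2:
small = largest_mono_component(10000, 2000, 2)   # 1000 edges per color: no giant
big = largest_mono_component(10000, 30000, 2)    # 15000 edges per color: giant
print(small, big)
```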
Ramsey games with giants
Abstract:
The classical result in the theory of random graphs, proved by Erdős and Rényi in 1960, concerns the threshold for the appearance of the giant component in the random graph process. We consider a variant of this problem with a Ramsey flavor: each random edge that arrives in the sequence of rounds must be colored with one of r colors. The goal can be either to create a giant component in every color class or, alternatively, to avoid one in every color. One can analyze the offline or online setting of this problem. In this paper, we consider all these variants and provide non-trivial upper and lower bounds; in certain cases (such as online avoidance) the obtained bounds are asymptotically tight.