Results 1–6 of 6
Using Multiple Hash Functions to Improve IP Lookups
 IN PROCEEDINGS OF IEEE INFOCOM
, 2000
Abstract

Cited by 79 (11 self)
High performance Internet routers require a mechanism for very efficient IP address lookups. Some techniques used to this end, such as binary search on levels, need to construct quickly a good hash table for the appropriate IP prefixes. In this paper we describe an approach for obtaining good hash tables based on using multiple hashes of each input key (which is an IP address). The methods we describe are fast, simple, scalable, parallelizable, and flexible. In particular, in instances where the goal is to have one hash bucket fit into a cache line, using multiple hashes proves extremely suitable. We provide a general analysis of this hashing technique and specifically discuss its application to binary search on levels.
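The multiple-hash placement the abstract describes can be sketched as a minimal d-choice scheme: each key hashes to d candidate buckets and is stored in the least loaded one, which keeps bucket sizes small enough to fit a cache line. The salted-SHA-256 hash family and the `bucket_indices`/`insert` helper names below are illustrative assumptions, not the paper's construction.

```python
import hashlib

def bucket_indices(key: str, num_buckets: int, d: int = 2):
    """Compute d candidate bucket indices via salted hashes.
    (Illustrative salting scheme, not the paper's hash functions.)"""
    return [
        int(hashlib.sha256(f"{i}:{key}".encode()).hexdigest(), 16) % num_buckets
        for i in range(d)
    ]

def insert(table, key, d: int = 2):
    """Place the key in the least-loaded of its d candidate buckets."""
    candidates = bucket_indices(key, len(table), d)
    target = min(candidates, key=lambda b: len(table[b]))
    table[target].append(key)

table = [[] for _ in range(8)]
for ip in ["10.0.0.1", "10.0.0.2", "192.168.1.1", "172.16.0.5"]:
    insert(table, ip)
```

A lookup must probe all d candidate buckets, which is the price paid for the much more even load distribution.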
Cuckoo hashing: Further analysis
, 2003
Abstract

Cited by 27 (1 self)
We consider cuckoo hashing as proposed by Pagh and Rodler in 2001. We show that the expected construction time of the hash table is O(n) as long as the two open addressing tables are each of size at least (1 + ε)n, where ε > 0 and n is the number of data points. Slightly improved bounds are obtained for various probabilities and constraints. The analysis rests on simple properties of branching processes.
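A bare-bones version of the insertion procedure analyzed here can be sketched in Python. The MD5-based hash family and the `max_kicks` bound are illustrative assumptions; the paper assumes suitably random hash functions, and a real implementation rehashes when an eviction chain runs too long.

```python
import hashlib

def h(seed: int, key: str, size: int) -> int:
    # Hypothetical hash family for illustration only.
    return int(hashlib.md5(f"{seed}:{key}".encode()).hexdigest(), 16) % size

def cuckoo_insert(t0, t1, key, max_kicks: int = 32) -> bool:
    """Insert key, evicting occupants back and forth between the two tables.
    Returns False if the eviction chain exceeds max_kicks (rehash needed)."""
    tables = (t0, t1)
    side = 0
    for _ in range(max_kicks):
        idx = h(side, key, len(tables[side]))
        key, tables[side][idx] = tables[side][idx], key  # place key, evict old
        if key is None:
            return True
        side ^= 1  # retry the evicted key in the other table
    return False

def lookup(t0, t1, key) -> bool:
    """Each key has exactly two possible locations, one per table."""
    return t0[h(0, key, len(t0))] == key or t1[h(1, key, len(t1))] == key

# Tables comfortably larger than (1 + eps)n for n = 4 keys.
t0, t1 = [None] * 16, [None] * 16
for k in ["alpha", "beta", "gamma", "delta"]:
    cuckoo_insert(t0, t1, k)
```

The constant worst-case lookup cost (two probes) is what makes the O(n) expected construction time result attractive.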
Almost All Graphs With Average Degree 4 Are 3-Colorable
 In Proc. STOC
, 2002
Abstract

Cited by 26 (6 self)
We analyze a randomized version of the Brélaz heuristic on sparse random graphs. We prove that almost all graphs with average degree d ≤ 4.03, i.e. G(n, p = d/n), are 3-colorable and that a constant fraction of all 4-regular graphs are 3-colorable.
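For reference, the deterministic Brélaz heuristic (DSATUR) can be sketched as below: repeatedly color the uncolored vertex with the most distinctly-colored neighbors, using the smallest legal color. The paper analyzes a randomized variant, so this plain greedy version is only an illustration of the underlying coloring rule.

```python
def dsatur(adj, colors_avail: int = 3):
    """Greedy Brelaz/DSATUR coloring over an adjacency dict.
    Returns a vertex->color map, or None if some vertex has no legal color.
    (Plain greedy sketch; the paper studies a randomized variant.)"""
    color = {}
    while len(color) < len(adj):
        # Pick the uncolored vertex of maximum saturation, ties by degree.
        v = max((u for u in adj if u not in color),
                key=lambda u: (len({color[w] for w in adj[u] if w in color}),
                               len(adj[u])))
        used = {color[w] for w in adj[v] if w in color}
        free = [c for c in range(colors_avail) if c not in used]
        if not free:
            return None
        color[v] = free[0]
    return color

# A 5-cycle: 3-colorable, but not 2-colorable (odd cycle).
c5 = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
coloring = dsatur(c5, colors_avail=3)
```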
The Asymptotics of Selecting the Shortest of Two, Improved
 UNIVERSITY OF ILLINOIS
, 1999
Abstract

Cited by 25 (11 self)
We investigate variations of a novel, recently proposed load balancing scheme based on small amounts of choice. The static setting is modeled as a balls-and-bins process. The balls are sequentially placed into bins, with each ball selecting d bins randomly and going to the bin with the fewest balls. A similar dynamic setting is modeled as a scenario where tasks arrive as a Poisson process at a bank of FIFO servers and queue at one for service. Tasks probe a small random sample of servers in the bank and queue at the server with the fewest tasks. Recently ...
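The static balls-and-bins process is easy to simulate, which makes the power-of-d-choices effect visible directly. The function below is a plain sketch; the parameter names and fixed seed are assumptions for reproducibility, not the paper's setup.

```python
import random

def max_load(n_balls: int, n_bins: int, d: int, seed: int = 0) -> int:
    """Sequentially place balls, each going to the least-loaded of
    d uniformly random candidate bins; return the maximum bin load."""
    rng = random.Random(seed)
    bins = [0] * n_bins
    for _ in range(n_balls):
        choices = [rng.randrange(n_bins) for _ in range(d)]
        best = min(choices, key=lambda b: bins[b])
        bins[best] += 1
    return max(bins)

# With d = 2 the maximum load drops dramatically versus d = 1:
# Theta(log log n) instead of Theta(log n / log log n).
one = max_load(10_000, 10_000, d=1)
two = max_load(10_000, 10_000, d=2)
```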
The Point of Point Crossover: Shuffling To Randomness
Abstract
The action of point crossover is modeled as a random walk on a group, and convergence and rate results are established for the walk. Specifically, it is shown that there is a cutoff phenomenon in the rate at which the sample gets randomized. As long as the number of crossover steps is less than a certain critical number, the total variation distance (with respect to the stationary distribution) is large, and remains essentially constant. But once the critical number has been crossed, the total variation distance goes to zero (at an exponential rate). The cutoff number of steps is of order O(lN ln N), where N is the sample size and l is the length of the chromosome. Finally, it is shown by heuristic arguments as well as by simulations that if a statistical criterion such as Kendall's W coefficient or the average Kendall's coefficient is used to measure randomness (rather than total variation distance), the sample can be said to be random (up to statistical significance) in O(ln N) steps, rather than O(lN ln N) steps. The properties of such criteria are characterized.
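To make the setup concrete, repeated one-point crossover on a population can be simulated as below. Treating each step as recombination of a random pair is an illustrative simplification of the paper's random-walk-on-a-group formalism; note that crossover only permutes genes within each locus, which is exactly the state space the walk shuffles toward randomness.

```python
import random

def one_point_crossover(parent_a, parent_b, rng):
    """Recombine two chromosomes at a uniformly random cut point."""
    cut = rng.randrange(1, len(parent_a))
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

def shuffle_population(pop, steps, rng):
    """Model repeated crossover as a random walk: at each step, pick a
    random pair of chromosomes and recombine them in place.
    (Illustrative dynamics, not the paper's exact formalism.)"""
    pop = [list(c) for c in pop]
    for _ in range(steps):
        i, j = rng.sample(range(len(pop)), 2)
        pop[i], pop[j] = one_point_crossover(pop[i], pop[j], rng)
    return pop

rng = random.Random(0)
N, l = 8, 16  # sample size N, chromosome length l
pop = [[g] * l for g in range(N)]  # start far from random: uniform chromosomes
mixed = shuffle_population(pop, steps=200, rng=rng)
```

Because suffix swaps never move a gene out of its locus, each column of the population remains a permutation of the starting column, and the walk randomizes within that invariant.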