Results 1–10 of 10
The Power of Two Random Choices: A Survey of Techniques and Results
 in Handbook of Randomized Computing
, 2000
Abstract

Cited by 99 (2 self)
To motivate this survey, we begin with a simple problem that demonstrates a powerful fundamental idea. Suppose that n balls are thrown into n bins, with each ball choosing a bin independently and uniformly at random. Then the maximum load, or the largest number of balls in any bin, is approximately log n / log log n with high probability. Now suppose instead that the balls are placed sequentially, and each ball is placed in the least loaded of d ≥ 2 bins chosen independently and uniformly at random. Azar, Broder, Karlin, and Upfal showed that in this case, the maximum load is log log n / log d + Θ(1) with high probability [ABKU99]. The important implication of this result is that even a small amount of choice can lead to drastically different results in load balancing. Indeed, having just two random choices (i.e.,...
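The two-choice effect described above is easy to observe empirically. The following minimal sketch (function name and parameters are illustrative, not from the survey) throws n balls into n bins with d uniform choices per ball and reports the maximum load:

```python
import random

def max_load(n, d, rng):
    """Throw n balls into n bins; each ball probes d uniformly random
    bins and goes to the least loaded one. Returns the maximum load."""
    bins = [0] * n
    for _ in range(n):
        probes = [rng.randrange(n) for _ in range(d)]
        bins[min(probes, key=lambda i: bins[i])] += 1
    return max(bins)

rng = random.Random(1)
n = 100_000
print("d=1 max load:", max_load(n, 1, rng))  # grows like log n / log log n
print("d=2 max load:", max_load(n, 2, rng))  # grows like log log n / log 2
```

Even at modest n, the d=2 maximum load is visibly smaller than the d=1 load, matching the log log n versus log n gap.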
Using Multiple Hash Functions to Improve IP Lookups
 IN PROCEEDINGS OF IEEE INFOCOM
, 2000
Abstract

Cited by 68 (11 self)
High-performance Internet routers require a mechanism for very efficient IP address lookups. Some techniques used to this end, such as binary search on levels, need to quickly construct a good hash table for the appropriate IP prefixes. In this paper we describe an approach for obtaining good hash tables based on using multiple hashes of each input key (which is an IP address). The methods we describe are fast, simple, scalable, parallelizable, and flexible. In particular, in instances where the goal is to have one hash bucket fit into a cache line, using multiple hashes proves extremely suitable. We provide a general analysis of this hashing technique and specifically discuss its application to binary search on levels.
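The multiple-hash idea can be sketched as follows. The salted-SHA-256 construction, the table sizes, and the prefix strings below are illustrative assumptions, not the paper's actual scheme; the point is that each key has d candidate buckets, is stored in the least-full one, and a lookup therefore touches at most d buckets:

```python
import hashlib

def bucket_choices(key, num_buckets, d):
    """d pseudo-independent bucket indices for a key, derived here by
    salting the key and hashing with SHA-256 (an illustrative choice)."""
    return [int(hashlib.sha256(f"{salt}:{key}".encode()).hexdigest(), 16)
            % num_buckets for salt in range(d)]

def build_table(keys, num_buckets, d=2):
    """Place each key in the least-full of its d candidate buckets."""
    buckets = [[] for _ in range(num_buckets)]
    for k in keys:
        best = min(bucket_choices(k, num_buckets, d),
                   key=lambda i: len(buckets[i]))
        buckets[best].append(k)
    return buckets

def lookup(table, key, d=2):
    """A key can only live in one of its d buckets, so a lookup reads at
    most d buckets (each sized to fit a cache line in the intended use)."""
    return any(key in table[i] for i in bucket_choices(key, len(table), d))

prefixes = [f"10.{i // 256}.{i % 256}.0/24" for i in range(4096)]
table = build_table(prefixes, 4096)
print("largest bucket:", max(len(b) for b in table))
```

With two hashes the largest bucket stays small even at load one key per bucket, which is what makes the one-bucket-per-cache-line goal realistic.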
Lower bounds for random 3-SAT via differential equations
 THEORETICAL COMPUTER SCIENCE
, 2001
Setting 2 variables at a time yields a new lower bound for random 3-SAT (Extended Abstract)
 STOC
, 2000
Abstract

Cited by 35 (4 self)
Let X be a set of n Boolean variables and denote by C(X) the set of all 3-clauses over X, i.e. the set of all 8(n choose 3) possible disjunctions of three distinct, non-complementary literals from variables in X. Let F(n, m) be a random 3-SAT formula formed by selecting, with replacement, m clauses uniformly at random from C(X) and taking their conjunction. The satisfiability threshold conjecture asserts that there exists a constant r3 such that as n → ∞, F(n, rn) is satisfiable with probability that tends to 1 if r < r3, but unsatisfiable with probability that tends to 1 if r > r3. Experimental evidence suggests r3 ≈ 4.2. We prove r3 > 3.145, improving over the previous best lower bound r3 > 3.003 due to Frieze and Suen. For this, we introduce a satisfiability heuristic that works iteratively, permanently setting the value of a pair of variables in each round. The framework we develop for the analysis of our heuristic allows us to also derive most previous lower bounds for random 3-SAT in a uniform manner and with little effort.
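The style of iterative heuristic analyzed in such lower-bound proofs can be illustrated with the simpler unit-clause heuristic sketched below (the paper's heuristic instead fixes a pair of variables per round; everything here, including the clause model, is an illustrative simplification):

```python
import random

def random_3sat(n, m, rng):
    """m clauses over variables 1..n; a literal is +v or -v, and the three
    variables in a clause are distinct (so literals are non-complementary)."""
    return [tuple(v if rng.random() < 0.5 else -v
                  for v in rng.sample(range(1, n + 1), 3)) for _ in range(m)]

def uc_heuristic(n, formula, rng):
    """Unit-clause heuristic: satisfy a unit clause when one exists,
    otherwise set a random free variable to a random value. Returns a
    satisfying assignment, or None when it produces an empty clause."""
    clauses = [set(c) for c in formula]
    assign, free = {}, set(range(1, n + 1))
    while free:
        unit = next((c for c in clauses if len(c) == 1), None)
        if unit:
            lit = next(iter(unit))
        else:
            v = rng.choice(sorted(free))
            lit = v if rng.random() < 0.5 else -v
        assign[abs(lit)] = lit > 0
        free.discard(abs(lit))
        reduced = []
        for c in clauses:
            if lit in c:
                continue                      # clause satisfied, drop it
            c = c - {-lit}
            if not c:
                return None                   # contradiction: heuristic fails
            reduced.append(c)
        clauses = reduced
    return assign

rng = random.Random(0)
n = 200
for r in (2.0, 4.5):
    wins = sum(uc_heuristic(n, random_3sat(n, int(r * n), rng), rng)
               is not None for _ in range(20))
    print(f"r = {r}: heuristic succeeds on {wins}/20 formulas")
```

The differential-equation framework referred to in the abstract tracks how the expected numbers of 1-, 2-, and 3-clauses evolve as such a heuristic runs, yielding a density below which it succeeds with high probability.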
Mean FDE Models for Internet Congestion Control Under a Many-Flows Regime
 IEEE Transactions on Information Theory
, 2001
Abstract

Cited by 30 (11 self)
Congestion control algorithms used in the Internet are difficult to analyze or simulate on a large scale, i.e., when there are large numbers of nodes, links, and sources in a network. The reasons for this include the complexity of the actual implementation of the algorithm and the randomness introduced in the packet arrival and service processes due to many factors, such as arrivals and departures of sources and uncontrollable short flows in the network. To make the analysis or simulation tractable, deterministic fluid approximations of these algorithms are often used. These approximations are in the form of either deterministic delay differential equations or, more generally, deterministic functional differential equations (FDEs). In this paper, we ignore the complexity introduced by the window-based implementation of such algorithms and focus on the randomness in the network. We justify the use of deterministic models for proportionally fair congestion controllers under a limiting regime where the number of flows in a network is large.
Analyses of Load Stealing Models Based on Differential Equations
 In Proceedings of the 10th Annual ACM Symposium on Parallel Algorithms and Architectures
, 1998
Abstract

Cited by 19 (0 self)
In this paper we develop models for and analyze several randomized work stealing algorithms in a dynamic setting. Our models represent the limiting behavior of systems as the number of processors grows to infinity using differential equations. The advantages of this approach include the ability to model a large variety of systems and to provide accurate numerical approximations of system behavior even when the number of processors is relatively small. We show how this approach can yield significant intuition about the behavior of work stealing algorithms in realistic settings.
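The differential-equation methodology described here can be illustrated with the classic supermarket-model ODEs for d-choice load balancing (shown in place of the paper's work-stealing equations, which are more involved): s_i(t) is the fraction of servers with at least i jobs, arrivals occur at rate lam per server, and each arrival picks the shorter of d random queues.

```python
# Euler integration of the supermarket-model ODEs:
#   ds_i/dt = lam * (s_{i-1}^d - s_i^d) - (s_i - s_{i+1}),  s_0 = 1.
# Parameters are illustrative; K truncates the (infinite) system.
lam, d, K, dt, T = 0.7, 2, 30, 0.01, 200.0
s = [1.0] + [0.0] * K
for _ in range(int(T / dt)):
    new = s[:]
    for i in range(1, K):
        new[i] += dt * (lam * (s[i-1]**d - s[i]**d) - (s[i] - s[i+1]))
    s = new
# The known fixed point is s_i = lam ** ((d**i - 1) / (d - 1)).
print([round(v, 4) for v in s[:5]])
```

The trajectory converges to the doubly exponential fixed point s_i = lam^((d^i - 1)/(d - 1)), and, as the abstract notes, such limits remain accurate approximations even for moderately many processors.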
The Asymptotics of Selecting the Shortest of Two, Improved
 UNIVERSITY OF ILLINOIS
, 1999
Abstract

Cited by 17 (7 self)
We investigate variations of a novel, recently proposed load balancing scheme based on small amounts of choice. The static setting is modeled as a ballsandbins process. The balls are sequentially placed into bins, with each ball selecting d bins randomly and going to the bin with the fewest balls. A similar dynamic setting is modeled as a scenario where tasks arrive as a Poisson process at a bank of FIFO servers and queue at one for service. Tasks probe a small random sample of servers in the bank and queue at the server with the fewest tasks. Recently
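The dynamic setting can be sketched with a small Gillespie-style simulation (parameters and the function name are illustrative): n rate-1 exponential servers, Poisson arrivals at total rate n*lam, each task probing d random servers and queueing at the one with the fewest tasks.

```python
import random

def mean_queue(n, lam, d, events, rng):
    """Simulate `events` arrival/departure events and return the mean
    queue length at the end. Each event is an arrival with probability
    n*lam / (n*lam + busy), else a departure from a random busy server."""
    q = [0] * n
    busy = 0
    for _ in range(events):
        if busy == 0 or rng.random() < n * lam / (n * lam + busy):
            probes = [rng.randrange(n) for _ in range(d)]
            j = min(probes, key=lambda i: q[i])
            if q[j] == 0:
                busy += 1
            q[j] += 1
        else:
            while True:                    # rejection-sample a busy server
                j = rng.randrange(n)
                if q[j] > 0:
                    break
            q[j] -= 1
            if q[j] == 0:
                busy -= 1
    return sum(q) / n

rng = random.Random(0)
for d in (1, 2):
    print(f"d={d}: mean queue ≈ {mean_queue(200, 0.9, d, 400_000, rng):.2f}")
```

At load lam = 0.9 the d=1 system behaves like independent M/M/1 queues with mean near lam/(1-lam) = 9, while probing just d=2 servers drops the mean queue length to roughly 2, the "shortest of two" improvement the paper quantifies.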
On the Construction of Lyapunov Functions for Nonlinear Markov Processes via Relative Entropy, submitted for publication
, 2011
Abstract

Cited by 1 (0 self)
We develop an approach to the construction of Lyapunov functions for the forward equation of a finite-state nonlinear Markov process. Nonlinear Markov processes can be obtained as a law-of-large-numbers limit for a system of weakly interacting processes. The approach exploits this connection and the fact that relative entropy defines a Lyapunov function for the solution of the forward equation for the many-particle system. Candidate Lyapunov functions for the nonlinear Markov process are constructed via limits, and verified for certain classes of models.
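The linear-case fact being leveraged is the standard one: for an ergodic finite-state Markov process with generator $Q$ and stationary law $\pi$, relative entropy decreases along the forward equation,

```latex
H(p \,\|\, \pi) \;=\; \sum_{i} p_i \log \frac{p_i}{\pi_i},
\qquad
\frac{d}{dt}\, H\big(p(t) \,\|\, \pi\big) \;\le\; 0
\quad \text{along } \dot{p}(t) = p(t)\,Q .
```

The paper's construction passes this Lyapunov property from the many-particle (linear) system to the nonlinear limit equation.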
Infinite Parallel Job Allocation (Extended Abstract)
, 2000
Abstract
Petra Berenbrink, Dept. of Mathematics & Computer Science, Paderborn University, D-33095 Paderborn, Germany, pebe@uni-paderborn.de; Artur Czumaj, Department of Computer and Information Science, New Jersey Institute of Technology, University Heights, Newark, NJ 07102-1982, USA, czumaj@cis.njit.edu; Tom Friedetzky, Institut für Informatik, Technische Universität München, D-80290 München, Germany, friedetz@informatik.tu-muenchen.de; Nikita D. Vvedenskaya, Institute of Information Transmission Problems, Russian Academy of Science, Moscow 101447, Russia, ndv@iitp.ru. Abstract: In recent years, the task of allocating jobs to servers has been studied with the "balls and bins" abstraction. Results in this area exploit the large decrease in maximum load that can be achieved by allowing each job (ball) a little freedom in choosing its destination server (bin). In this paper we examine an infinite and parallel allocation process (see [ABS98]) which is related to the "balls and bins" abs...
Some Notes on Random Satisfiability
, 2001
Abstract
3-SAT is a canonical NP-complete problem: satisfiable and unsatisfiable instances cannot generally be distinguished in polynomial time. However, random 3-SAT formulas show a phase transition: for any large number of variables n, sparse random formulas (with m ≤ 3.145n clauses) are almost always satisfiable, dense ones (with m ≥ 4.596n clauses) are almost always unsatisfiable, and the transition occurs sharply when m/n crosses some threshold. It is believed that the limiting threshold is around 4.2, but it is not even known that a limit exists. Proofs of the satisfiability...
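The transition can be glimpsed by exhaustive checking at small n (a sketch only: at such small n, finite-size effects blur the sharp threshold, and the clause model below is a common simplification):

```python
import itertools
import random

def random_formula(n, m, rng):
    """m random 3-clauses over n variables: three distinct variables per
    clause, each negated with probability 1/2."""
    return [[(v, rng.random() < 0.5) for v in rng.sample(range(n), 3)]
            for _ in range(m)]

def is_sat(formula, n):
    """Exhaustive satisfiability check (feasible only for small n); a
    literal (v, s) is true under assignment bits when bits[v] == s."""
    return any(all(any(bits[v] == s for v, s in cl) for cl in formula)
               for bits in itertools.product((False, True), repeat=n))

rng, n, trials = random.Random(0), 10, 10
for r in (3.0, 4.6):
    sat = sum(is_sat(random_formula(n, int(r * n), rng), n)
              for _ in range(trials))
    print(f"m/n = {r}: {sat}/{trials} satisfiable")
```

Below the conjectured threshold most sampled formulas come out satisfiable; above it, most do not, and the contrast sharpens rapidly as n grows.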