Results 1–10 of 71
Efficient erasure correcting codes
 IEEE Transactions on Information Theory
, 2001
Abstract
Cited by 250 (20 self)
Abstract—We introduce a simple erasure recovery algorithm for codes derived from cascades of sparse bipartite graphs and analyze the algorithm by analyzing a corresponding discrete-time random process. As a result, we obtain a simple criterion involving the fractions of nodes of different degrees on both sides of the graph which is necessary and sufficient for the decoding process to finish successfully with high probability. By carefully designing these graphs we can construct, for any given rate R and any given real number ε, a family of linear codes of rate R which can be encoded in time proportional to ln(1/ε) times their block length n. Furthermore, a codeword can be recovered with high probability from a portion of its entries of length (1 + ε)Rn or more. The recovery algorithm also runs in time proportional to n ln(1/ε). Our algorithms have been implemented and work well in practice; various implementation issues are discussed. Index Terms—Erasure channel, large deviation analysis, low-density parity-check codes.
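The peeling idea behind such erasure decoders can be sketched in a few lines. This is a generic illustration of iterative recovery on a sparse bipartite graph, not the paper's actual cascade construction; `peel_decode` and the toy parity checks below are hypothetical.

```python
def peel_decode(checks, received):
    """Recover erased symbols by peeling: any parity check with exactly
    one erased variable determines that variable as the XOR of the
    known ones.

    checks:   list of variable-index lists; each list XORs to 0.
    received: dict mapping non-erased positions to their bits.
    """
    symbols = dict(received)
    progress = True
    while progress:
        progress = False
        for check in checks:
            erased = [v for v in check if v not in symbols]
            if len(erased) == 1:
                v = erased[0]
                symbols[v] = sum(symbols[u] for u in check if u != v) % 2
                progress = True
    return symbols
```

For example, with checks [[0, 1, 2], [1, 2, 3]] and positions 0 and 3 erased from the codeword 1011, both erasures are recovered in one pass. Decoding fails (harmlessly, returning the partial result) only when every remaining check has two or more erased neighbours, which is exactly the condition the paper's degree-distribution criterion is designed to avoid.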
The Power of Two Choices in Randomized Load Balancing
 IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS
, 1996
Abstract
Cited by 198 (22 self)
Suppose that n balls are placed into n bins, each ball being placed into a bin chosen independently and uniformly at random. Then, with high probability, the maximum load in any bin is approximately log n / log log n. Suppose instead that each ball is placed sequentially into the least full of d bins chosen independently and uniformly at random. It has recently been shown that the maximum load is then only log log n / log d + O(1) with high probability. Thus giving each ball two choices instead of just one leads to an exponential improvement in the maximum load. This result demonstrates the power of two choices, and it has several applications to load balancing in distributed systems. In this thesis, we expand upon this result by examining related models and by developing techniques for stu...
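A direct simulation makes the gap concrete. The `max_load` helper below is an illustrative sketch; the d bins are sampled with replacement, matching the "independently and uniformly at random" model in the abstract.

```python
import random

def max_load(n, d, seed=0):
    """Throw n balls into n bins; each ball goes to the currently least
    loaded of d bins chosen independently and uniformly at random.
    Returns the maximum load over all bins."""
    rng = random.Random(seed)
    bins = [0] * n
    for _ in range(n):
        i = min((rng.randrange(n) for _ in range(d)),
                key=bins.__getitem__)
        bins[i] += 1
    return max(bins)
```

Running this for n in the tens of thousands typically shows the single-choice maximum load several times larger than the two-choice one, consistent with the log n / log log n versus log log n / log 2 behaviour.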
Sudden Emergence Of A Giant k-Core In A Random Graph.
 J. Combinatorial Theory, Series B
, 1996
Abstract
Cited by 104 (8 self)
The k-core of a graph is the largest subgraph with minimum degree at least k. For the Erdős-Rényi random graph G(n, m) on n vertices, with m edges, it is known that a giant 2-core grows simultaneously with a giant component, that is, when m is close to n/2. We show that for k ≥ 3, with high probability, a giant k-core appears suddenly when m reaches c_k n/2; here c_k = min_{λ>0} λ/π_k(λ) and π_k(λ) = P{Poisson(λ) ≥ k − 1}. In particular, c_3 ≈ 3.35. We also demonstrate that, unlike the 2-core, when a k-core appears for the first time it is very likely to be giant, of size p_k(λ_k) n. Here λ_k is the minimum point of λ/π_k(λ) and p_k(λ_k) = P{Poisson(λ_k) ≥ k}. For k = 3, for instance, the newborn 3-core contains about 0.27n vertices. Our proofs are based on the probabilistic analysis of an edge deletion algorithm that always finds a k-core if the graph has one. 1991 Mathematics Subject Classification. Primary 05C80, 05C85, 60C05; Secondary 60F10, 60G42, 60J10.
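The deletion algorithm mentioned at the end is essentially the standard peeling procedure: repeatedly remove any vertex of degree below k, and whatever survives is the k-core. A minimal sketch (the `k_core` helper and its dict-of-neighbour-sets representation are illustrative assumptions):

```python
def k_core(adj, k):
    """Return the vertex set of the k-core of an undirected graph.
    adj: dict mapping each vertex to a set of its neighbours."""
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    queue = [v for v, ns in adj.items() if len(ns) < k]
    while queue:
        v = queue.pop()
        if v not in adj:          # already deleted
            continue
        for u in adj.pop(v):      # delete v, update its neighbours
            if u in adj:
                adj[u].discard(v)
                if len(adj[u]) < k:
                    queue.append(u)
    return set(adj)
```

For instance, a triangle with a pendant vertex attached has the triangle as its 2-core and an empty 3-core; deleting the pendant never cascades in the first case, but does in the second.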
The Power of Two Random Choices: A Survey of Techniques and Results
 in Handbook of Randomized Computing
, 2000
Abstract
Cited by 98 (2 self)
To motivate this survey, we begin with a simple problem that demonstrates a powerful fundamental idea. Suppose that n balls are thrown into n bins, with each ball choosing a bin independently and uniformly at random. Then the maximum load, or the largest number of balls in any bin, is approximately log n / log log n with high probability. Now suppose instead that the balls are placed sequentially, and each ball is placed in the least loaded of d ≥ 2 bins chosen independently and uniformly at random. Azar, Broder, Karlin, and Upfal showed that in this case, the maximum load is log log n / log d + Θ(1) with high probability [ABKU99]. The important implication of this result is that even a small amount of choice can lead to drastically different results in load balancing. Indeed, having just two random choices (i.e.,...
Analysis of Random Processes via And-Or Tree Evaluation
 In Proceedings of the 9th Annual ACM-SIAM Symposium on Discrete Algorithms
, 1998
Abstract
Cited by 73 (23 self)
We introduce a new set of probabilistic analysis tools based on the analysis of And-Or trees with random inputs. These tools provide a unifying, intuitive, and powerful framework for carrying out the analysis of several previously studied random processes of interest, including random loss-resilient codes, solving random k-SAT formulas using the pure literal rule, and the greedy algorithm for matchings in random graphs. In addition, these tools allow generalizations of these problems not previously analyzed to be analyzed in a straightforward manner. We illustrate our methodology on the three problems listed above. 1 Introduction. We introduce a new set of probabilistic analysis tools related to the amplification method introduced by [12] and further developed and used in [13, 5]. These tools provide a unifying, intuitive, and powerful framework for carrying out the analysis of several previously studied random processes of interest, including the random loss-resilient codes introduced ...
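Analyses in this style typically reduce to a one-dimensional iteration y_{l+1} = f(y_l) over tree depths. As an illustration only — this particular instance is the standard (3,6)-regular erasure-decoding recursion, chosen as a familiar example rather than taken from the paper itself:

```python
def erased_fraction(delta, rounds=200):
    """Tree-style recursion for a (3,6)-regular sparse code on an
    erasure channel with erasure probability delta:
        y <- delta * (1 - (1 - y)**5)**2
    y tracks the probability that a randomly chosen message along an
    edge is still erased after one more round."""
    y = delta
    for _ in range(rounds):
        y = delta * (1 - (1 - y) ** 5) ** 2
    return y
```

Below the threshold for this degree pair (about 0.4294) the iteration collapses to zero, meaning decoding succeeds with high probability; above it, a positive fixed point survives and a constant fraction of messages stays erased.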
Lower bounds for random 3-SAT via differential equations
 THEORETICAL COMPUTER SCIENCE
, 2001
On the Analysis of Randomized Load Balancing Schemes
 IN PROCEEDINGS OF THE 9TH ANNUAL ACM SYMPOSIUM ON PARALLEL ALGORITHMS AND ARCHITECTURES
, 1998
Abstract
Cited by 55 (7 self)
It is well known that simple randomized load balancing schemes can balance load effectively while incurring only a small overhead, making such schemes appealing for practical systems. In this paper, we provide new analyses for several such dynamic randomized load balancing schemes. Our work extends a previous analysis of the supermarket model, a model that abstracts a simple, efficient load balancing scheme in the setting where jobs arrive at a large system of parallel processors. In this model, customers arrive at a system of n servers as a Poisson stream of rate λn, λ < 1, with service requirements exponentially distributed with mean 1. Each customer chooses d servers independently and uniformly at random from the n servers, and is served according to the First In First Out (FIFO) protocol at the choice with the fewest customers. For the supermarket model, it has been shown that using d = 2 choices yields an exponential improvement in the expected time a customer spends in the syst...
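A small event-driven simulation shows the d = 2 effect directly. Everything here (`supermarket_sim`, its parameter names) is an illustrative sketch under the abstract's assumptions: Poisson arrivals at rate λn, Exp(1) service, FIFO queues.

```python
import heapq
import random

def supermarket_sim(n, lam, d, num_customers, seed=0):
    """Simulate n FIFO servers; arrivals form a Poisson(lam * n) stream,
    each customer joins the shortest of d uniformly random queues, and
    service times are Exp(1). Returns the average time the completed
    customers spent in the system."""
    rng = random.Random(seed)
    queues = [0] * n                       # current queue length per server
    arrivals = [[] for _ in range(n)]      # FIFO arrival times per server
    events = [(rng.expovariate(lam * n), 'A', -1)]
    done, total = 0, 0.0
    while done < num_customers:
        t, kind, i = heapq.heappop(events)
        if kind == 'A':
            i = min((rng.randrange(n) for _ in range(d)),
                    key=queues.__getitem__)
            queues[i] += 1
            arrivals[i].append(t)
            if queues[i] == 1:             # server was idle: start service
                heapq.heappush(events, (t + rng.expovariate(1.0), 'D', i))
            heapq.heappush(events, (t + rng.expovariate(lam * n), 'A', -1))
        else:                              # departure from server i
            queues[i] -= 1
            total += t - arrivals[i].pop(0)
            done += 1
            if queues[i] > 0:              # next customer begins service
                heapq.heappush(events, (t + rng.expovariate(1.0), 'D', i))
    return total / done
```

At high load (say λ = 0.9) the d = 1 case behaves like independent M/M/1 queues with mean sojourn time 1/(1 − λ), while d = 2 gives a markedly smaller average, matching the exponential improvement cited above.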
Differential equation approximations for Markov chains, manuscript
, 2005
"... We formulate some simple conditions under which a Markov chain may be approximated by the solution to a differential equation, with quantifiable error probabilities. The role of a choice of coordinate functions for the Markov chain is emphasised. The general theory is illustrated in three examples: ..."
Abstract

Cited by 42 (1 self)
 Add to MetaCart
We formulate some simple conditions under which a Markov chain may be approximated by the solution to a differential equation, with quantifiable error probabilities. The role of a choice of coordinate functions for the Markov chain is emphasised. The general theory is illustrated in three examples: the classical stochastic epidemic, a population process model with fast and slow variables, and core-finding algorithms for large random hypergraphs.
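As a toy version of the first example, one can compare a continuous-time Markov chain for an SI epidemic against an Euler solve of its fluid-limit ODE; the functions and parameters below are illustrative assumptions, not the paper's actual setup.

```python
import random

def si_chain(n, k0, beta, t_end, seed=0):
    """CTMC for an SI epidemic: with k of n individuals infected, the
    next infection occurs after an Exp(beta * k * (n - k) / n) holding
    time. Returns the infected fraction at time t_end."""
    rng = random.Random(seed)
    k, t = k0, 0.0
    while k < n:
        t += rng.expovariate(beta * k * (n - k) / n)
        if t > t_end:
            break
        k += 1
    return k / n

def si_ode(x0, beta, t_end, steps=10000):
    """Euler solve of the deterministic limit dx/dt = beta * x * (1 - x)."""
    x, dt = x0, t_end / steps
    for _ in range(steps):
        x += beta * x * (1 - x) * dt
    return x
```

For large n and a macroscopic initial fraction, the chain's trajectory concentrates around the logistic ODE solution, with fluctuations of order n^{-1/2} — exactly the kind of quantifiable error the abstract refers to.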
Maximum matchings in sparse random graphs: Karp-Sipser revisited
, 1997
Abstract
Cited by 35 (10 self)
We study the average performance of a simple greedy algorithm for finding a matching in a sparse random graph G_{n,c/n}, where c > 0 is constant. The algorithm was first proposed by Karp and Sipser [12]. We give significantly improved estimates of the errors made by the algorithm. For the subcritical case where c < e we show that the algorithm finds a maximum matching with high probability. If c > e then with high probability the algorithm produces a matching which is within n^{1/5+o(1)} of maximum size. 1 Introduction. A matching in a graph G = (V, E) is a set of edges in E which are vertex disjoint. A standard problem in algorithmic graph theory is to find the largest possible matching in a graph. The first polynomial time algorithm to solve this problem was devised by Edmonds in 1965 and runs in time O(|V|^4) [10]. Over the years, many improvements have been made. Currently the fastest such algorithm is that of Micali and Vazirani, which dates back to 1980. Its running time is O(...
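The Karp-Sipser rule itself is short: while the graph has a pendant (degree-1) vertex, match it to its neighbour; otherwise match a random edge. A sketch follows; the `karp_sipser` helper and its representation are assumptions for illustration, and the fallback step picks a random endpoint then a random neighbour, which is not exactly edge-uniform but fine for a sketch.

```python
import random

def karp_sipser(adj, seed=0):
    """Greedy matching: prefer matching a degree-1 vertex to its unique
    neighbour; otherwise match a (roughly) random edge.
    adj: dict mapping each vertex to a set of its neighbours."""
    rng = random.Random(seed)
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    matching = []

    def remove(v):
        for u in adj.pop(v):
            adj[u].discard(v)

    while True:
        pendants = [v for v, ns in adj.items() if len(ns) == 1]
        if pendants:
            v = pendants[0]
            u = next(iter(adj[v]))
        else:
            live = [v for v, ns in adj.items() if ns]
            if not live:
                return matching
            v = rng.choice(live)                 # random endpoint...
            u = rng.choice(sorted(adj[v]))       # ...then random neighbour
        matching.append((v, u))
        remove(v)
        remove(u)
```

Matching a pendant vertex is always safe (some maximum matching contains that edge), so errors can only come from the random-edge fallback — which is why the subcritical regime, where pendants dominate, yields a maximum matching with high probability.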
Setting 2 variables at a time yields a new lower bound for random 3-SAT (Extended Abstract)
 STOC
, 2000
Abstract
Cited by 34 (4 self)
Let X be a set of n Boolean variables and denote by C(X) the set of all 3-clauses over X, i.e. the set of all 8·C(n, 3) possible disjunctions of three distinct, non-complementary literals from variables in X. Let F(n, m) be a random 3-SAT formula formed by selecting, with replacement, m clauses uniformly at random from C(X) and taking their conjunction. The satisfiability threshold conjecture asserts that there exists a constant r3 such that as n → ∞, F(n, rn) is satisfiable with probability that tends to 1 if r < r3, but unsatisfiable with probability that tends to 1 if r > r3. Experimental evidence suggests r3 ≈ 4.2. We prove r3 > 3.145, improving over the previous best lower bound r3 > 3.003 due to Frieze and Suen. For this, we introduce a satisfiability heuristic that works iteratively, permanently setting the value of a pair of variables in each round. The framework we develop for the analysis of our heuristic allows us to also derive most previous lower bounds for random 3-SAT in a uniform manner and with little effort.
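For small n the experimental picture can be checked directly by sampling F(n, m) and brute-forcing satisfiability. The `random_3sat` and `satisfiable` helpers below are illustrative, not the paper's heuristic; literals are signed integers, so clause (1, -2, 3) means x1 ∨ ¬x2 ∨ x3.

```python
import itertools
import random

def random_3sat(n, m, rng):
    """Sample m clauses uniformly with replacement from the 8*C(n, 3)
    3-clauses over n variables (distinct, non-complementary literals)."""
    clauses = []
    for _ in range(m):
        vars_ = rng.sample(range(1, n + 1), 3)
        clauses.append(tuple(v if rng.random() < 0.5 else -v
                             for v in vars_))
    return clauses

def satisfiable(n, clauses):
    """Brute force over all 2**n assignments (tiny n only)."""
    for bits in itertools.product([False, True], repeat=n):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c)
               for c in clauses):
            return True
    return False
```

Averaging `satisfiable` over many samples of F(n, rn) at various clause densities r gives the empirical satisfiability curve whose apparent transition near r ≈ 4.2 motivates the threshold conjecture (finite-size effects blur it badly at brute-forceable n).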