Results 1-10 of 85
A fast and simple randomized parallel algorithm for the maximal . . .
, 1986
Abstract

Cited by 52 (0 self)
A simple parallel randomized algorithm to find a maximal independent set in a graph G = (V, E) on n vertices is presented. Its expected running time on a concurrent-read concurrent-write PRAM with O(|E| d_max) processors is O(log n), where d_max denotes the maximum degree. On an exclusive-read exclusive-write PRAM with O(|E|) processors the algorithm runs in O(log^2 n). Previously, an O(log^4 n) deterministic algorithm was given by Karp and Wigderson for the EREW-PRAM model. This was recently (independently of our work) improved to O(log^2 n) by M. Luby. In both cases randomized algorithms depending on pairwise independent choices were turned into deterministic algorithms. We comment on how randomized combinatorial algorithms whose analysis only depends on d-wise rather than fully independent random choices (for some constant d) can be converted into deterministic algorithms. We apply a technique due to A. Joffe (1974) and obtain deterministic construction in fast parallel time of various combinatorial objects whose existence follows from probabilistic arguments.
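The round structure described in this abstract can be illustrated with a sequential sketch of a Luby-style randomized MIS algorithm: each round, every undecided vertex draws a random priority, local minima join the independent set, and they and their neighbors are removed. This is a simulation for intuition only, not the paper's PRAM implementation.

```python
import random

def randomized_mis(adj):
    """Sequential sketch of a Luby-style randomized maximal-independent-set
    algorithm. adj: dict mapping each vertex to the set of its neighbors."""
    live = set(adj)   # vertices still undecided
    mis = set()
    while live:
        # Each live vertex draws a random priority.
        r = {v: random.random() for v in live}
        # A vertex joins the MIS if its priority beats all live neighbors.
        winners = {v for v in live
                   if all(r[v] < r[u] for u in adj[v] if u in live)}
        mis |= winners
        # Winners and their neighbors are decided; remove them.
        removed = set(winners)
        for v in winners:
            removed |= adj[v] & live
        live -= removed
    return mis
```

On a path 0-1-2-3 this returns a set that is independent (no two members adjacent) and maximal (every other vertex has a neighbor in it); in the parallel setting each round runs in constant time, and O(log n) rounds suffice in expectation.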
Extracting randomness using few independent sources
 In Proceedings of the 45th Annual IEEE Symposium on Foundations of Computer Science
, 2004
Abstract

Cited by 49 (6 self)
In this work we give the first deterministic extractors from a constant number of weak sources whose entropy rate is less than 1/2. Specifically, for every δ > 0 we give an explicit construction for extracting randomness from a constant (depending polynomially on 1/δ) number of distributions over {0, 1}^n, each having min-entropy δn. These extractors output n bits, which are 2^{-n}-close to uniform. This construction uses several results from additive number theory, and in particular a recent one by Bourgain, Katz and Tao [BKT03] and of Konyagin [Kon03]. We also consider the related problem of constructing randomness dispersers. For any constant output length m, our dispersers use a constant number of identical distributions, each with min-entropy Ω(log n), and output every possible m-bit string with positive probability. The main tool we use is a variant of the "stepping-up lemma" used in establishing lower bounds ...
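For intuition about what a two-source extractor computes, here is the classical inner-product extractor of Chor and Goldreich. Note this is emphatically NOT the construction of this paper: inner product only works when the two entropy rates sum to more than 1, whereas the paper handles rate below 1/2 using additive number theory.

```python
def inner_product_extractor(x, y):
    """Toy two-source extractor: inner product mod 2 of two equal-length
    bit strings. Classical Chor-Goldreich construction, shown only to
    illustrate the extractor interface -- not this paper's construction."""
    assert len(x) == len(y)
    return sum(a & b for a, b in zip(x, y)) % 2
```

A quick exhaustive check with n = 3, where X is uniform over strings starting with 0 (min-entropy 2) and Y is fully uniform, shows the output bit is close to, but not exactly, unbiased: 12 of the 32 input pairs yield 1.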
Approximating Hyper-Rectangles: Learning and Pseudorandom Sets
 Journal of Computer and System Sciences
, 1997
Abstract

Cited by 44 (3 self)
The PAC learning of rectangles has been studied because they have been found experimentally to yield excellent hypotheses for several applied learning problems. Also, pseudorandom sets for rectangles have been actively studied recently because (i) they are a subproblem common to the derandomization of depth-2 (DNF) circuits and of randomized logspace, and (ii) they approximate the distribution of n independent multi-valued random variables. We present improved upper bounds for a class of such problems of "approximating" high-dimensional rectangles that arise in PAC learning and pseudorandomness. Key words and phrases: rectangles, machine learning, PAC learning, derandomization, pseudorandomness, multiple-instance learning, explicit constructions, Ramsey graphs, random graphs, sample complexity, approximations of distributions. 1 Introduction. A basic common theme of a large part of PAC learning and derandomization/computational pseudorandomness is to "approximate" a stru...
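The textbook PAC learner for axis-aligned rectangles is the tightest-fit hypothesis: output the smallest rectangle containing all positive examples. This is the standard illustration of the setting, not the improved constructions of this paper.

```python
def tightest_fit_rectangle(positives):
    """Smallest axis-aligned rectangle containing all positive examples,
    as a list of per-dimension (lo, hi) intervals. The classic PAC
    learner for rectangles: its hypothesis is always contained in the
    target rectangle, so it never misclassifies a true negative."""
    dims = len(positives[0])
    return [(min(p[d] for p in positives), max(p[d] for p in positives))
            for d in range(dims)]

def classify(rect, point):
    """True iff point lies inside the hypothesis rectangle."""
    return all(lo <= x <= hi for (lo, hi), x in zip(rect, point))
```

With enough samples the tightest fit is close to the target with high probability; the sample complexity grows with the dimension, which is one reason high-dimensional rectangle approximation is studied.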
Simulating Independence: New Constructions of Condensers, Ramsey Graphs, Dispersers, and Extractors
 In Proceedings of the 37th Annual ACM Symposium on Theory of Computing
, 2005
Abstract

Cited by 43 (13 self)
We present new explicit constructions of deterministic randomness extractors, dispersers and related objects. More precisely, a distribution X over binary strings of length n is called a δ-source if it assigns probability at most 2^{-δn} to any string of length n, and for any δ > 0 we construct the following poly(n)-time computable functions: 2-source disperser: D : ({0, 1}^n)^2 → {0, 1} such that for any two independent δ-sources X1, X2 we have that the support of D(X1, X2) is {0, 1}. Bipartite Ramsey graph: Let N = 2^n. A corollary is that the function D is a 2-coloring of the edges of K_{N,N} (the complete bipartite graph over two sets of N vertices) such that any induced subgraph of size N^δ by N^δ is not monochromatic. 3-source extractor: E : ({0, 1}^n)^3 → {0, 1} such that for any three independent δ-sources X1, X2, X3 we have that E(X1, X2, X3) is (o(1)-close to being) an unbiased random bit. No previous explicit construction was known for either of these for any δ < 1/2, and these results constitute major progress on long-standing open problems. A component in these results is a new construction of condensers that may be of independent interest.
Randomized graph products, chromatic numbers, and the Lovász theta-function
 Combinatorica
, 1996
Abstract

Cited by 41 (6 self)
For a graph G, let α(G) denote the size of the largest independent set in G, and let θ(G) denote the Lovász θ-function on G. We prove that for some c > 0, there exists an infinite family of graphs such that θ(G) > α(G) · n / 2^{c√(log n)}, where n denotes the number of vertices in a graph. This disproves a known conjecture regarding the θ function. As part of our proof, we analyse the behavior of the chromatic number in graphs under a randomized version of graph products. This analysis extends earlier work of Linial and Vazirani, and of Berman and Schnitger, and may be of independent interest. 1 Introduction. Lovász [21] introduced the θ function in order to study the so-called "Shannon capacity" of graphs. For every graph G, the θ function enjoys the following sandwich property: α(G) ≤ θ(G) ≤ χ̄(G), where α(G) is the size of the largest independent set in G, and χ̄(G) is the clique cover number of G (χ̄(G) = χ(Ḡ), the chromatic number of the complement of G). This sandwich prop...
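The sandwich property α(G) ≤ θ(G) ≤ χ̄(G) can be checked concretely on the 5-cycle, where θ(C5) = √5 is Lovász's classical computation (computing θ in general needs semidefinite programming, so the sketch below brute-forces only the two combinatorial ends of the sandwich).

```python
from itertools import combinations, product

def alpha(n, edges):
    """Brute-force independence number: largest vertex set spanning no edge."""
    E = {frozenset(e) for e in edges}
    return max(k for k in range(n + 1)
               for S in combinations(range(n), k)
               if not any(frozenset(p) in E for p in combinations(S, 2)))

def clique_cover(n, edges):
    """Clique cover number chi-bar(G) = chromatic number of the complement,
    found by exhaustively trying k-colorings of the complement graph."""
    E = {frozenset(e) for e in edges}
    comp = [(u, v) for u, v in combinations(range(n), 2)
            if frozenset((u, v)) not in E]
    return min(k for k in range(1, n + 1)
               for c in product(range(k), repeat=n)
               if all(c[u] != c[v] for u, v in comp))
```

For C5 this gives α = 2 and χ̄ = 3, sandwiching θ(C5) = √5 ≈ 2.236, so even on five vertices neither bound is tight.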
Superpolynomial Size Set-Systems with Restricted Intersections mod 6 and Explicit Ramsey Graphs
 Combinatorica
, 1999
Abstract

Cited by 35 (5 self)
We construct a system H of exp(c log^2 n / log log n) subsets of a set of n elements such that the size of each set is divisible by 6 but their pairwise intersections are not divisible by 6. The result generalizes to all non-prime-power moduli m in place of m = 6. This result is in sharp contrast with results of Frankl and Wilson (1981) for prime power moduli and gives strong negative answers to questions by Frankl and Wilson (1981) and Babai and Frankl (1992). We use our set-system H to give an explicit Ramsey-graph construction, reproducing the logarithmic order of magnitude of the best previously known construction due to Frankl and Wilson (1981). Our construction uses certain mod m polynomials, discovered by Barrington, Beigel and Rudich (1994). 1 Introduction. Generalizing the Ray-Chaudhuri–Wilson theorem [8], Frankl and Wilson [6] proved the following intersection theorem, one of the most important results in extremal set theory: ...
Exponential separation of quantum and classical one-way communication complexity
 SIAM J. Comput
Abstract

Cited by 35 (2 self)
Abstract. We give the first exponential separation between quantum and bounded-error randomized one-way communication complexity. Specifically, we define the Hidden Matching Problem HM_n: Alice gets as input a string x ∈ {0, 1}^n and Bob gets a perfect matching M on the n coordinates. Bob's goal is to output a tuple 〈i, j, b〉 such that the edge (i, j) belongs to the matching M and b = x_i ⊕ x_j. We prove that the quantum one-way communication complexity of HM_n is O(log n), yet any randomized one-way protocol with bounded error must use Ω(√n) bits of communication. No asymptotic gap for one-way communication was previously known. Our bounds also hold in the model of Simultaneous Messages (SM), and hence we provide the first exponential separation between quantum SM and randomized SM with public coins. For a Boolean decision version of HM_n, we show that the quantum one-way communication complexity remains O(log n) and that the 0-error randomized one-way communication complexity is Ω(n). We prove that any randomized linear one-way protocol with bounded error for this problem requires Ω((n log n)^{1/3}) bits of communication. Key words: communication complexity, quantum computation, separation, hidden matching. AMS subject classifications: 68P30, 68Q15, 68Q17, 81P68. 1. Introduction. The ...
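The Hidden Matching Problem itself is easy to pin down in code: given Alice's string x and Bob's matching M, the set of valid answers 〈i, j, b〉 is determined directly from the definition above. This sketch only formalizes the problem statement; the interesting content of the paper is how little communication a quantum protocol needs to produce one such answer.

```python
def correct_answers(x, matching):
    """All valid outputs (i, j, b) for the Hidden Matching Problem HM_n:
    x is Alice's bit string, matching is Bob's perfect matching on the
    n coordinates, and a valid answer is any matched pair (i, j)
    together with b = x_i XOR x_j."""
    return {(i, j, x[i] ^ x[j]) for (i, j) in matching}
```

A one-way protocol succeeds if Bob outputs any single element of this set; Bob never needs all of x, which is what makes a relational (multi-valid-answer) problem like this amenable to an exponential quantum saving.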
2-source dispersers for subpolynomial entropy and Ramsey graphs beating the Frankl–Wilson construction
 Proceedings of STOC06
, 2006
Abstract

Cited by 27 (6 self)
The main result of this paper is an explicit disperser for two independent sources on n bits, each of entropy k = n^{o(1)}. Put differently, setting N = 2^n and K = 2^k, we construct explicit N × N Boolean matrices for which no K × K submatrix is monochromatic. Viewed as adjacency matrices of bipartite graphs, this gives an explicit construction of K-Ramsey bipartite graphs of size N. This greatly improves the previous bound of k = o(n) of Barak, Kindler, Shaltiel, Sudakov and Wigderson [4]. It also significantly improves the 25-year record of k = Õ(√n) on the special case of Ramsey graphs, due to Frankl and Wilson [9]. The construction uses (besides "classical" extractor ideas) almost all of the machinery developed in the last couple of years for extraction from independent sources, including: • Bourgain's extractor for 2 independent sources of some entropy rate < 1/2 [5] • Raz's extractor for 2 independent sources, one of which has any entropy rate > 1/2 [18] • Rao's extractor for 2 independent block-sources of entropy n^{Ω(1)} [17]
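The matrix reformulation in this abstract can be made concrete with a brute-force checker for monochromatic K × K submatrices. This only illustrates the definition on toy sizes; the whole point of the paper is constructing exponentially large matrices for which the property holds by design rather than by search.

```python
from itertools import combinations

def has_monochromatic_submatrix(M, K):
    """True iff the square 0/1 matrix M contains K rows and K columns
    whose intersection is all-0 or all-1. A 2-source disperser for
    entropy k corresponds to a matrix with no such submatrix for
    K = 2^k. Exhaustive search -- tiny examples only."""
    n = len(M)
    for rows in combinations(range(n), K):
        for cols in combinations(range(n), K):
            if len({M[r][c] for r in rows for c in cols}) == 1:
                return True
    return False
```

For example, the 4 × 4 inner-product parity matrix M[i][j] = popcount(i AND j) mod 2 does contain a monochromatic 2 × 2 submatrix (rows 0, 1 and columns 0, 2 are all zero), while the 2 × 2 matrix [[0, 1], [1, 0]] contains none.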
The Shannon Capacity of a union
 Combinatorica
, 1998
Abstract

Cited by 25 (0 self)
For an undirected graph G = (V, E), let G^n denote the graph whose vertex set is V^n, in which two distinct vertices (u_1, u_2, ..., u_n) and (v_1, v_2, ..., v_n) are adjacent iff for all i between 1 and n either u_i = v_i or u_i v_i ∈ E. The Shannon capacity c(G) of G is the limit lim_{n→∞} (α(G^n))^{1/n}, where α(G^n) is the maximum size of an independent set of vertices in G^n. We show that there are graphs G and H such that the Shannon capacity of their disjoint union is (much) bigger than the sum of their capacities. This disproves a conjecture of Shannon raised in 1956.
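The adjacency rule for G^n defined in this abstract (the strong product power) is easy to encode, and the classic 5-cycle example shows why capacity can exceed α(G): C5 has α(C5) = 2, yet the set {(i, 2i mod 5)} of 5 vertices is independent in C5^2, giving c(C5) ≥ √5 (Lovász later proved equality via the θ-function). This sketch illustrates the definitions only, not the paper's union construction.

```python
def strong_product_adjacent(u, v, adj):
    """Adjacency in G^n per the abstract: distinct tuples u, v are
    adjacent iff in every coordinate the entries are equal or adjacent.
    adj: dict mapping each vertex of G to its neighbor set."""
    return u != v and all(a == b or b in adj[a] for a, b in zip(u, v))

# C5: vertex i is adjacent to i-1 and i+1 (mod 5).
adj = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}

# Classic independent set of size 5 in C5^2, witnessing c(C5) >= sqrt(5).
S = {(i, (2 * i) % 5) for i in range(5)}
```

Checking all pairs in S confirms no two are adjacent in C5^2, so α(C5^2) ≥ 5 > α(C5)^2 = 4; the same super-multiplicativity phenomenon, transported to disjoint unions, is what the paper exploits.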