Results 1–10 of 13
Extractors and Pseudorandom Generators
Journal of the ACM, 1999
Cited by 87 (5 self)
We introduce a new approach to constructing extractors. Extractors are algorithms that transform a "weakly random" distribution into an almost uniform distribution. Explicit constructions of extractors have a variety of important applications, and tend to be very difficult to obtain.
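The yardstick behind "almost uniform" is statistical (total variation) distance. A toy sketch of that measurement, with our own example source and names (none of this is from the paper's construction):

```python
# Toy illustration: measure how far a distribution is from uniform,
# the standard yardstick for extractor outputs.

def statistical_distance(p, q):
    """Total variation distance: (1/2) * sum_x |p(x) - q(x)|."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

# A "weakly random" 2-bit source: biased, but no single outcome dominates.
weak = {"00": 0.4, "01": 0.3, "10": 0.2, "11": 0.1}
uniform4 = {x: 0.25 for x in weak}
print(statistical_distance(weak, uniform4))  # ~0.2: noticeably non-uniform

# The parity bit happens to be perfectly uniform for this particular source;
# general extractors are needed precisely because no fixed deterministic map
# works for every weak source.
out = {"0": 0.0, "1": 0.0}
for x, px in weak.items():
    out[str(x.count("1") % 2)] += px
print(statistical_distance(out, {"0": 0.5, "1": 0.5}))  # ~0.0
```

An extractor is judged by making this distance at most some small ε for every source with enough min-entropy, not just for one hand-picked distribution.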
Extracting all the Randomness and Reducing the Error in Trevisan's Extractors
In Proceedings of the 31st Annual ACM Symposium on Theory of Computing, 1999
Cited by 78 (16 self)
We give explicit constructions of extractors which work for a source of any min-entropy on strings of length n. These extractors can extract any constant fraction of the min-entropy using O(log² n) additional random bits, and can extract all the min-entropy using O(log³ n) additional random bits. Both of these constructions use fewer truly random bits than any previous construction which works for all min-entropies and extracts a constant fraction of the min-entropy. We then improve our second construction and show that we can reduce the entropy loss to 2 log(1/ε) + O(1) bits, while still using O(log³ n) truly random bits (where entropy loss is defined as [(source min-entropy) + (# truly random bits used) − (# output bits)], and ε is the statistical difference from uniform achieved). This entropy loss is optimal up to a constant additive term. ...
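The bracketed entropy-loss bookkeeping above is simple arithmetic; a sketch under assumed example numbers (the parameters k, d, m and the stand-in O(1) constant are ours, purely for illustration):

```python
import math

def entropy_loss(source_min_entropy, seed_bits, output_bits):
    """[(source min-entropy) + (# truly random bits used) - (# output bits)]."""
    return source_min_entropy + seed_bits - output_bits

def loss_bound(eps, constant=2):
    """The abstract's improved bound, 2*log(1/eps) + O(1), with a stand-in constant."""
    return 2 * math.log2(1 / eps) + constant

# Hypothetical extractor: k = 1000 bits of min-entropy, d = 30 seed bits,
# m = 1010 output bits.
print(entropy_loss(1000, 30, 1010))  # 20
print(loss_bound(2 ** -8))           # 18.0 for eps = 2^-8
```

The point of the abstract's result is that the loss can be pushed down to roughly 2 log(1/ε), which is known to be the best possible up to an additive constant.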
Computing with Very Weak Random Sources
1994
Cited by 73 (7 self)
For any fixed δ > 0, we show how to simulate RP algorithms in time n^{O(log n)} using the output of a δ-source with min-entropy Rδ. Such a weak random source is asked once for R bits; it outputs an R-bit string such that any string has probability at most 2^{−Rδ}. If δ > 1 − 1/(k + 1), our BPP simulations take time n^{O(log^(k) n)} (log^(k) is the logarithm iterated k times). We also give a polynomial-time BPP simulation using Chor-Goldreich sources of min-entropy R^{Ω(1)}, which is optimal. We present applications to time-space tradeoffs, expander constructions, and the hardness of approximation. Also of interest is our randomness-efficient Leftover Hash Lemma, found independently by Goldreich & Wigderson.
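The δ-source condition above (every R-bit string has probability at most 2^{−Rδ}) is exactly a min-entropy lower bound, and is easy to check on a concrete distribution. A toy sketch with our own helper names and example source:

```python
import math

# The delta-source condition follows the abstract: an R-bit source where
# every string has probability at most 2^(-delta*R), i.e. min-entropy >= delta*R.

def min_entropy(dist):
    """H_min(X) = -log2(max_x Pr[X = x])."""
    return -math.log2(max(dist.values()))

def is_delta_source(dist, R, delta):
    """True iff the source's min-entropy is at least delta * R."""
    return min_entropy(dist) >= delta * R

# Toy 3-bit source with one over-represented string.
dist = {"000": 0.25}
for s in ("001", "010", "011", "100", "101", "110", "111"):
    dist[s] = 0.75 / 7

print(min_entropy(dist))                      # 2.0 bits (vs. 3.0 if uniform)
print(is_delta_source(dist, R=3, delta=0.5))  # True:  2.0 >= 1.5
print(is_delta_source(dist, R=3, delta=0.9))  # False: 2.0 < 2.7
```

Larger δ means a stronger guarantee on the source, which is why the simulation time in the abstract improves as δ approaches 1.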
Almost Optimal Dispersers
In Proceedings of the 30th Annual ACM Symposium on Theory of Computing, 1998
Cited by 23 (4 self)
A (K, ε) disperser graph G = (V₁, V₂, E) is a bipartite graph with the property that for any subset A ⊆ V₁ of cardinality K, the neighbors of A cover at least a 1 − ε fraction of the vertices of V₂. Such graphs have many applications in derandomization. Saks, Srinivasan and Zhou presented an explicit construction of (K = 2^k, ε) disperser graphs G = (V = [2^n], W, E) with an almost optimal degree D = poly(n, ε⁻¹), for every k ≥ n^{Ω(1)}. We extend their result to any parameter k ≤ n. 1 Introduction A disperser is a sparse graph with strong random-like properties. As such, explicit dispersers have numerous applications in derandomization (many of them appearing in the excellent survey paper by Nisan [Nis96]). The question whether explicit constructions of such graphs exist has attracted much research in the last decade [Sip88, Zuc90, Zuc91, NZ93, SZ94, SSZ95, Zuc96]. Saks, Srinivasan and Zhou [SSZ95] showed an almost optimal disperser constructio...
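The disperser property quoted above can be checked directly, by brute force, on small graphs. A sketch with our own function and variable names (exponential in |V₁|, so only for tiny examples, and in no way the paper's construction):

```python
from itertools import combinations

def is_disperser(left, right, edges, K, eps):
    """Check the (K, eps) disperser property by brute force: every K-subset
    of the left side must cover >= (1 - eps) * |V2| right-side vertices."""
    neighbors = {u: set() for u in left}
    for u, w in edges:
        neighbors[u].add(w)
    threshold = (1 - eps) * len(right)
    return all(
        len(set().union(*(neighbors[u] for u in A))) >= threshold
        for A in combinations(left, K)
    )

# A complete bipartite graph disperses trivially...
left, right = [0, 1, 2, 3], ["a", "b", "c"]
complete = [(u, w) for u in left for w in right]
print(is_disperser(left, right, complete, K=2, eps=0.1))  # True

# ...while a graph whose left vertices all share one neighbor does not.
star = [(u, "a") for u in left]
print(is_disperser(left, right, star, K=2, eps=0.1))  # False
```

The hard part, and the subject of the paper, is achieving this property explicitly with degree D close to the nonconstructive optimum, rather than verifying it.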
Constructions of NearOptimal Extractors Using PseudoRandom Generators
Electronic Colloquium on Computational Complexity, 1998
Cited by 20 (3 self)
We introduce a new approach to construct extractors, combinatorial objects akin to expander graphs that have several applications. Our approach is based on error-correcting codes and on the Nisan-Wigderson pseudorandom generator. A straightforward application of our approach yields a construction that is simple to describe and analyze, does not use any of the standard techniques used in related results, and improves or subsumes almost all the previous constructions. 1 Introduction Informally defined, an extractor is a function that extracts randomness from a weakly random distribution. Explicit constructions of extractors have several applications and are typically very hard to achieve. In this paper we introduce a new approach to the explicit construction of extractors. Our approach yields a construction that improves most of the known results, and that is optimal for certain parameters. Furthermore, our construction is simple and uses techniques that were never used in this field...
Derandomizing Arthur-Merlin Games under Uniform Assumptions
Computational Complexity, 2000
Cited by 15 (0 self)
We study how the nondeterminism versus determinism problem and the time versus space problem are related to the problem of derandomization. In particular, we show two ways of derandomizing the complexity class AM under uniform assumptions, which was only known previously under nonuniform assumptions [13, 14]. First, we prove that either AM = NP or it appears to any nondeterministic polynomial-time adversary that NP is contained in deterministic subexponential time infinitely often. This implies that to any nondeterministic polynomial-time adversary, the graph non-isomorphism problem appears to have subexponential-size proofs infinitely often, the first nontrivial derandomization of this problem without any assumption. Next, we show that either BPP = P, AM = NP, and PH ⊆ ⊕P all hold, or for any t(n) = 2^{Ω(n)}, DTIME(t(n)) ⊆ DSPACE(t^ε(n)) infinitely often for any constant ε > 0. Similar tradeoffs also hold for a whole range of parameters. This improves previous results [17, 5] ...
Uniform Hardness Versus Randomness Tradeoffs For Arthur-Merlin Games
2003
Cited by 8 (6 self)
Impagliazzo and Wigderson proved a uniform hardness vs.
Streaming Computation of Combinatorial Objects
In Proceedings of the Seventeenth Annual IEEE Conference on Computational Complexity, 2002
Cited by 7 (2 self)
We prove (mostly tight) space lower bounds for "streaming" (or "online") computations of four fundamental combinatorial objects: error-correcting codes, universal hash functions, extractors, and dispersers. Streaming computations for these objects are motivated algorithmically by massive data set applications and complexity-theoretically by pseudorandomness and derandomization for space-bounded probabilistic algorithms.
Extracting All the Randomness from a Weakly Random Source
1998
Cited by 6 (0 self)
In this paper, we give two explicit constructions of extractors, both of which work for a source of any min-entropy on strings of length n. The first extracts any constant fraction of the min-entropy using O(log² n) additional random bits. The second extracts all the min-entropy using O(log³ n) additional random bits. Both constructions use fewer truly random bits than any previous construction which works for all min-entropies and extracts a constant fraction of the min-entropy. The extractors are obtained by observing that a weaker notion of "combinatorial design" suffices for the Nisan-Wigderson pseudorandom generator [NW94], which underlies the recent extractor of Trevisan [Tre98]. We give near-optimal constructions of such "weak designs" which achieve much better parameters than possible with the notion of designs used by Nisan-Wigderson and Trevisan. 1 Introduction Roughly speaking, an extractor is a function which extracts truly random bits from a weakly random source,...
An introduction to randomness extractors
Cited by 6 (2 self)
Abstract. We give an introduction to the area of “randomness extraction” and survey the main concepts of this area: deterministic extractors, seeded extractors and multiple-source extractors. For each one we briefly discuss background, definitions, explicit constructions and applications.