Results 1–10 of 41
Simple Extractors for All Min-Entropies and a New Pseudo-Random Generator
Abstract

Cited by 111 (27 self)
We present a simple, self-contained extractor construction that produces good extractors for all min-entropies (min-entropy measures the amount of randomness contained in a weak random source). Our construction is algebraic and builds on a new polynomial-based approach introduced by Ta-Shma, Zuckerman, and Safra [37]. Using our improvements, we obtain, for example, an extractor with output length m = k^(1−δ) and seed length O(log n). This matches the parameters of Trevisan's breakthrough result [38] and additionally achieves those parameters for small min-entropies k. Extending [38] to small k has been the focus of a sequence of recent works [15, 26, 35]. Our construction gives a much simpler and more direct solution to this problem. Applying similar ideas to the problem of building pseudo-random generators, we obtain a new pseudo-random generator construction that is not based on the NW generator [21], and turns worst-case hardness directly into pseudo-randomness. The parameters of this generator match those in [16, 33] and in particular are strong enough to obtain a new proof that P = BPP if E requires exponential size circuits. Essentially the same construction yields a hitting set generator with optimal seed length that outputs s^(Ω(1)) bits when given a function that requires circuits of size s (for any s). This implies a hardness versus randomness tradeoff for RP and BPP that is optimal (up to polynomial factors), solving an open problem raised by [14]. Our generators can also be used to derandomize AM in a way that improves and extends the results of [4, 18, 20].
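Min-entropy, the parameter these extractors are measured against, has a one-line definition: H_∞(X) = −log2 max_x Pr[X = x]. A minimal sketch (the function name and toy distributions below are illustrative, not taken from the paper):

```python
import math

def min_entropy(probs):
    """Min-entropy H_inf(X) = -log2(max_x Pr[X = x]).

    A source has min-entropy at least k when no single outcome has
    probability above 2^-k; an extractor for min-entropy k must work
    for every source meeting that bound.
    """
    return -math.log2(max(probs))

# A uniform distribution on 8 outcomes has min-entropy 3 bits.
print(min_entropy([1/8] * 8))            # 3.0
# A biased source: one outcome has probability 1/2, so H_inf = 1,
# even though its Shannon entropy is noticeably higher.
print(min_entropy([0.5] + [0.5/7] * 7))  # 1.0
```

Note that min-entropy is determined solely by the most likely outcome, which is why it is the right worst-case measure for extraction.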
Extractors and Pseudorandom Generators
 Journal of the ACM
, 1999
Abstract

Cited by 104 (6 self)
We introduce a new approach to constructing extractors. Extractors are algorithms that transform a "weakly random" distribution into an almost uniform distribution. Explicit constructions of extractors have a variety of important applications, and tend to be very difficult to obtain.
Extracting Randomness: A Survey and New Constructions
, 1999
Abstract

Cited by 85 (3 self)
In this paper we do two things. First, we survey extractors and dispersers: what they are, how they can be designed, and some of their applications. The work described in the survey is due to a long list of research papers by various authors, most notably by David Zuckerman. Then, we present a new tool for constructing explicit extractors and give two new constructions that greatly improve upon previous results. The new tool we devise, a "merger," is a function that accepts d strings, one of which is uniformly distributed, and outputs a single string that is guaranteed to be uniformly distributed. We show how to build good explicit mergers, and how mergers can be used to build better extractors. Using this, we present two new constructions. The first construction succeeds in extracting all of the randomness from any somewhat random source. This improves upon previous extractors that extract only some of the randomness from somewhat random sources with "enough" randomness. The amount of truly random bits used by this extractor, however, is not optimal. The second extractor we build extracts only some of the randomness and works only for sources with enough randomness, but uses a near-optimal amount of truly random bits. Extractors and dispersers have many applications in "removing randomness" in various settings and in making randomized constructions explicit. We survey some of these applications and note whenever our new constructions yield better results, e.g., plugging our new extractors into a previous construction we achieve the first explicit N-superconcentrators of linear size and polyloglog(N) depth. © 1999 Academic Press
Circuit Minimization Problem
 In ACM Symposium on Theory of Computing (STOC)
, 1999
Abstract

Cited by 34 (4 self)
We study the complexity of the circuit minimization problem: given the truth table of a Boolean function f and a parameter s, decide whether f can be realized by a Boolean circuit of size at most s. We argue why this problem is unlikely to be in P (or even in P/poly) by giving a number of surprising consequences of such an assumption. We also argue that proving this problem to be NP-complete (if it is indeed true) would imply proving strong circuit lower bounds for the class E, which appears beyond the currently known techniques. Keywords: hard Boolean functions, derandomization, natural properties, NP-completeness. 1 Introduction. An n-variable Boolean function f_n : {0,1}^n → {0,1} can be given by either its truth table of size 2^n, or a Boolean circuit whose size may be significantly smaller than 2^n. It is well known that most Boolean functions on n variables have circuit complexity at least 2^n / n [Sha49], but so far no family of sufficiently hard functions has ...
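The counting argument behind the 2^n/n lower bound cited above can be sketched numerically: there are 2^(2^n) Boolean functions on n bits, while only a bounded number of circuits with s gates exist, so once the latter count is smaller, some function needs more than s gates. The bound below is a deliberately crude over-count for illustration, not the tight estimate from [Sha49]:

```python
def circuit_count_upper_bound(n, s):
    # Each of the s gates picks one of 3 operations (AND/OR/NOT) and
    # two predecessors among the n inputs and the s gates, giving at
    # most (3 * (n + s)^2)^s circuits -- a rough over-count.
    return (3 * (n + s) ** 2) ** s

def some_function_needs_more_than(n, s):
    # Shannon-style counting: 2**(2**n) Boolean functions on n bits;
    # if fewer circuits of size s exist, some function escapes them all.
    return circuit_count_upper_bound(n, s) < 2 ** (2 ** n)

# For n = 10 inputs, circuits with 50 gates cannot realize every function.
print(some_function_needs_more_than(10, 50))  # True
```

The striking point, echoed in the abstract, is that this argument is non-constructive: it proves hard functions exist without exhibiting a single one.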
Constructions of Near-Optimal Extractors Using Pseudo-Random Generators
 Electronic Colloquium on Computational Complexity
, 1998
Abstract

Cited by 18 (3 self)
We introduce a new approach to construct extractors, combinatorial objects akin to expander graphs that have several applications. Our approach is based on error-correcting codes and on the Nisan-Wigderson pseudorandom generator. A straightforward application of our approach yields a construction that is simple to describe and analyze, does not use any of the standard techniques used in related results, and improves or subsumes almost all the previous constructions. 1 Introduction. Informally defined, an extractor is a function that extracts randomness from a weakly random distribution. Explicit constructions of extractors have several applications and are typically very hard to achieve. In this paper we introduce a new approach to the explicit construction of extractors. Our approach yields a construction that improves most of the known results, and that is optimal for certain parameters. Furthermore, our construction is simple and uses techniques that were never used in this field ...
Derandomization: a brief overview
 Bulletin of the EATCS
Abstract

Cited by 18 (0 self)
This survey focuses on the recent (1998–2003) developments in the area of derandomization, with the emphasis on the derandomization of time-bounded randomized complexity classes.
Comparing Notions of Full Derandomization
 In Proceedings of the Sixteenth Annual IEEE Conference on Computational Complexity
, 2001
Abstract

Cited by 15 (0 self)
Most of the hypotheses of full derandomization fall into two sets of equivalent statements: Those equivalent to the existence of efficient pseudorandom generators and those equivalent to approximating the accepting probability of a circuit. We give the first relativized world where these sets of equivalent statements are not equivalent to each other.
Explicit OR-dispersers with polylogarithmic degree
 J. ACM
, 1998
Abstract

Cited by 14 (1 self)
An (N, M, T)-OR-disperser is a bipartite multigraph G = (V, W, E) with |V| = N and |W| = M, having the following expansion property: any subset of V having at least T vertices has a neighbor set of size at least M/2. For any pair of constants ξ, λ with 1 ≥ ξ > λ ≥ 0, any sufficiently large N, and for any T ≥ 2^((log N)^ξ) and M ≤ 2^((log N)^λ), we give an explicit elementary construction of an (N, M, T)-OR-disperser such that the out-degree of any vertex in V is at most polylogarithmic in N. Using this with known applications of OR-dispersers yields several results. First, our construction implies that the complexity class Strong-RP defined by Sipser equals RP. Second, for any fixed η > 0, we give the first polynomial-time simulation of RP algorithms using the output of any "η-minimally random" source. For any integral R > 0, such a source accepts a single request for an R-bit string and generates the string according to a distribution that assigns probability at most 2^(−Rη) to any string. It is minimally random in the sense that any weaker source is ...
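The expansion property in the definition above can be checked directly on small graphs. A brute-force sketch (exponential in N, so suitable only for toy instances; the function name and example graph are illustrative, not from the paper):

```python
from itertools import combinations

def is_or_disperser(neighbors, M, T):
    """Check the (N, M, T)-OR-disperser property by brute force:
    every subset of V with at least T vertices must reach at least
    M/2 right-side vertices.  `neighbors` maps each left vertex to
    its set of right-side neighbors.
    """
    V = list(neighbors)
    for size in range(T, len(V) + 1):
        for subset in combinations(V, size):
            reached = set().union(*(neighbors[v] for v in subset))
            if len(reached) < M / 2:
                return False
    return True

# Toy instance with N = 3, M = 4, T = 2: every pair of left vertices
# reaches at least M/2 = 2 right vertices.
nbrs = {0: {0, 1}, 1: {1, 2}, 2: {2, 3}}
print(is_or_disperser(nbrs, M=4, T=2))  # True
```

The point of the paper, of course, is to achieve this property explicitly at scale with polylogarithmic left degree, where exhaustive checking is hopeless.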
Improved derandomization of BPP using a hitting set generator
 Proceedings of Random99, LNCS 1671
, 1999
Abstract

Cited by 13 (2 self)
A hitting-set generator is a deterministic algorithm which generates a set of strings that intersects every dense set recognizable by a small circuit. A polynomial-time hitting-set generator readily implies RP = P. Andreev et al. (ICALP'96 and JACM 1998) showed that a polynomial-time hitting-set generator in fact implies the much stronger conclusion BPP = P.