Results 1–10 of 110
Pseudo-Random Generation from One-Way Functions
Proc. 20th STOC, 1988
Cited by 861 (18 self)

Abstract:
Pseudorandom generators are fundamental to many theoretical and applied aspects of computing. We show how to construct a pseudorandom generator from any one-way function. Since it is easy to construct a one-way function from a pseudorandom generator, this result shows that there is a pseudorandom generator iff there is a one-way function.
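One direction of this equivalence is easy to see concretely: any length-doubling generator can be iterated to achieve arbitrary polynomial stretch. The sketch below is illustrative only; it uses SHA-256 as a heuristic stand-in for a proven generator G (an assumption for demonstration, not the paper's construction).

```python
import hashlib

def g(seed: bytes) -> bytes:
    # Toy length-doubling "generator": 32-byte seed -> 64 output bytes.
    # SHA-256 here is a heuristic stand-in, NOT a provably secure PRG.
    return (hashlib.sha256(seed + b"0").digest()
            + hashlib.sha256(seed + b"1").digest())

def stretch(seed: bytes, n_bytes: int) -> bytes:
    # Iterate g: keep half its output as the next state and emit the
    # other half, so a short seed yields any polynomial number of bytes.
    out, state = b"", seed
    while len(out) < n_bytes:
        block = g(state)
        state, out = block[:32], out + block[32:]
    return out[:n_bytes]
```

Determinism is the point: the same 32-byte seed always expands to the same output stream, so the seed is the only randomness consumed.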
Derandomizing Polynomial Identity Tests Means Proving Circuit Lower Bounds (Extended Abstract)
2003
Cited by 175 (5 self)

Abstract:
Since Polynomial Identity Testing is a coRP problem, we obtain the following corollary: If RP = P (or, even, coRP ⊆ ∩_{ε>0} NTIME(2^{n^ε}), infinitely often), then NEXP is not computable by polynomial-size arithmetic circuits. Thus, establishing that RP = coRP or BPP = P would require proving superpolynomial lower bounds for Boolean or arithmetic circuits. We also show that any derandomization of RNC would yield new circuit lower bounds for a language in NEXP.
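For context, the coRP algorithm behind Polynomial Identity Testing is the Schwartz-Zippel test: a nonzero polynomial of total degree d vanishes at a uniformly random point with probability at most d/|field|, so random evaluation errs only one way. A minimal sketch of that standard test (the example polynomials are illustrative, not from the paper):

```python
import random

def pit_equal(p, q, num_vars, trials=20, prime=2**31 - 1):
    # Schwartz-Zippel identity test for black-box polynomials p, q over
    # F_prime. One-sided error: "False" is always correct; "True" is
    # wrong with probability <= (d/prime)^trials for degree-d inputs.
    for _ in range(trials):
        point = [random.randrange(prime) for _ in range(num_vars)]
        if p(*point) % prime != q(*point) % prime:
            return False  # found a witness: definitely not identical
    return True  # identical with high probability

# (x + y)^2 and x^2 + 2xy + y^2 agree everywhere; x^2 + y^2 does not.
same = pit_equal(lambda x, y: (x + y)**2,
                 lambda x, y: x*x + 2*x*y + y*y, 2)
diff = pit_equal(lambda x, y: (x + y)**2,
                 lambda x, y: x*x + y*y, 2)
```

The one-sided error is what places the problem in coRP: equal polynomials are never rejected, and unequal ones slip through only with negligible probability.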
Simple Extractors for All Min-Entropies and a New Pseudo-Random Generator
Cited by 111 (27 self)

Abstract:
We present a simple, self-contained extractor construction that produces good extractors for all min-entropies (min-entropy measures the amount of randomness contained in a weak random source). Our construction is algebraic and builds on a new polynomial-based approach introduced by Ta-Shma, Zuckerman, and Safra [37]. Using our improvements, we obtain, for example, an extractor with output length m = k^{1-δ} and seed length O(log n). This matches the parameters of Trevisan's breakthrough result [38] and additionally achieves those parameters for small min-entropies k. Extending [38] to small k has been the focus of a sequence of recent works [15, 26, 35]. Our construction gives a much simpler and more direct solution to this problem. Applying similar ideas to the problem of building pseudorandom generators, we obtain a new pseudorandom generator construction that is not based on the NW generator [21], and turns worst-case hardness directly into pseudorandomness. The parameters of this generator match those in [16, 33] and in particular are strong enough to obtain a new proof that P = BPP if E requires exponential size circuits. Essentially the same construction yields a hitting set generator with optimal seed length that outputs s^{Ω(1)} bits when given a function that requires circuits of size s (for any s). This implies a hardness versus randomness tradeoff for RP and BPP that is optimal (up to polynomial factors), solving an open problem raised by [14]. Our generators can also be used to derandomize AM in a way that improves and extends the results of [4, 18, 20].
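As a reference point for the parameters above: a source X has min-entropy k exactly when no single outcome has probability above 2^{-k}, i.e. H_∞(X) = min_x log2(1/Pr[X = x]); an extractor turns such an (n, k)-source plus a short seed into m nearly uniform bits. A small sketch of the quantity itself (the example distributions are illustrative):

```python
import math

def min_entropy(dist):
    # H_inf(X) = -log2(max_x Pr[X = x]) for a {outcome: probability} map.
    assert abs(sum(dist.values()) - 1.0) < 1e-9, "not a distribution"
    return -math.log2(max(dist.values()))

uniform2bit = {b: 0.25 for b in ("00", "01", "10", "11")}
skewed = {"00": 0.5, "01": 0.25, "10": 0.125, "11": 0.125}
# uniform2bit has min-entropy 2; skewed has only 1, despite 4 outcomes.
```

Note that min-entropy is governed by the single most likely outcome, which is why it is the right measure of "extractable" randomness: a source with one heavy outcome cannot yield many near-uniform bits however large its support.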
Graph Nonisomorphism Has Subexponential Size Proofs Unless The Polynomial-Time Hierarchy Collapses
SIAM Journal on Computing, 1998
Cited by 110 (4 self)

Abstract:
We establish hardness versus randomness tradeoffs for a broad class of randomized procedures. In particular, we create efficient nondeterministic simulations of bounded round Arthur-Merlin games using a language in exponential time that cannot be decided by polynomial size oracle circuits with access to satisfiability. We show that every language with a bounded round Arthur-Merlin game has subexponential size membership proofs for infinitely many input lengths unless exponential time coincides with the third level of the polynomial-time hierarchy (and hence the polynomial-time hierarchy collapses). This provides the first strong evidence that graph nonisomorphism has subexponential size proofs. We set up a general framework for derandomization which encompasses more than the traditional model of randomized computation. For a randomized procedure to fit within this framework, we only require that for any fixed input the complexity of checking whether the procedure succeeds on a given ...
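For background on why graph nonisomorphism has a bounded round Arthur-Merlin game at all: Arthur sends a random relabeling of a randomly chosen input graph, and Merlin must say which graph it came from, which he can do reliably only when the two graphs are non-isomorphic. A toy sketch of that standard protocol (brute-force prover, tiny graphs only; not this paper's construction):

```python
import random
from itertools import permutations

def relabel(edges, perm):
    # Apply a vertex permutation to an undirected edge set.
    return frozenset(frozenset((perm[u], perm[v])) for u, v in edges)

def arthur_challenge(g0, g1, n):
    # Arthur: pick graph i at random, send a random relabeling of it.
    i = random.randrange(2)
    perm = list(range(n))
    random.shuffle(perm)
    return i, relabel((g0, g1)[i], perm)

def merlin_answer(g0, n, challenge):
    # Merlin (computationally unbounded, modeled by brute force): answer
    # 0 iff the challenge is isomorphic to g0. If g0 and g1 are in fact
    # isomorphic he can only guess, so a false nonisomorphism claim is
    # caught with probability 1/2 per round.
    return 0 if any(relabel(g0, p) == challenge
                    for p in permutations(range(n))) else 1
```

On non-isomorphic inputs (say a 3-path versus a triangle) Merlin answers every challenge correctly; repeating the round drives a cheating prover's success probability down exponentially.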
Extractors and Pseudorandom Generators
Journal of the ACM, 1999
Cited by 104 (6 self)

Abstract:
We introduce a new approach to constructing extractors. Extractors are algorithms that transform a "weakly random" distribution into an almost uniform distribution. Explicit constructions of extractors have a variety of important applications, and tend to be very difficult to obtain.
Randomness vs. Time: Derandomization under a uniform assumption
Cited by 72 (11 self)

Abstract:
We prove that if BPP ≠ EXP, then every problem in BPP can be solved deterministically in subexponential time on almost every input (on every samplable ensemble for infinitely many input sizes). This is the first derandomization result for BPP based on uniform, non-cryptographic hardness assumptions. It implies the following gap in the average-instance complexities of problems in BPP: either these complexities are always subexponential or they contain arbitrarily large exponential functions. We use a construction of a small “pseudorandom” set of strings from a “hard function” in EXP which is identical to that used in the analogous non-uniform results of [21, 3]. However, previous proofs of correctness assume the “hard function” is not in P/poly. They give a nonconstructive argument that a circuit distinguishing the pseudorandom strings from truly random strings implies that a similarly-sized circuit exists computing the “hard function”. Our main technical contribution is to show that, if the “hard function” has certain properties, then this argument can be made constructive. We then show that, assuming EXP ⊆ P/poly, there are EXP-complete functions with these properties.
Some Applications of Coding Theory in Computational Complexity
2004
Cited by 65 (2 self)

Abstract:
Error-correcting codes and related combinatorial constructs play an important role in several recent (and old) results in computational complexity theory. In this paper we survey results on locally-testable and locally-decodable error-correcting codes, and their applications to complexity theory and to cryptography.
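"Locally decodable" can be made concrete with the simplest example, the Hadamard code: x ∈ {0,1}^n is encoded as all 2^n parities ⟨x, a⟩ mod 2, and any single bit x_i is recoverable with just two queries, since ⟨x, a⟩ ⊕ ⟨x, a ⊕ e_i⟩ = x_i. A minimal sketch of this standard construction (exponential-length code, illustrative only; not a result from the survey):

```python
import random

def hadamard_encode(x):
    # Codeword of length 2^n: position a holds the parity <x, a> mod 2.
    n = len(x)
    return [sum(x[i] * ((a >> i) & 1) for i in range(n)) % 2
            for a in range(2 ** n)]

def local_decode(word, n, i, trials=41):
    # Two queries per trial: word[a] xor word[a ^ e_i] equals x[i] when
    # both queried positions are uncorrupted; a majority vote over random
    # a tolerates any corruption rate below 1/4.
    votes = sum(word[a] ^ word[a ^ (1 << i)]
                for a in (random.randrange(2 ** n) for _ in range(trials)))
    return int(votes > trials // 2)
```

Each query pair (a, a ⊕ e_i) is individually uniform, so a δ-fraction of corruptions makes any single trial wrong with probability at most 2δ, and the majority vote absorbs that.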
Pseudorandomness for Network Algorithms
In Proceedings of the 26th Annual ACM Symposium on Theory of Computing, 1994
Cited by 53 (5 self)

Abstract:
We define pseudorandom generators for Yao's two-party communication complexity model and exhibit a simple construction, based on expanders, for it. We then use a recursive composition of such generators to obtain pseudorandom generators that fool distributed network algorithms. While the construction and the proofs are simple, we demonstrate the generality of such generators by giving several applications.

1 Introduction
The theory of pseudorandomness is aimed at understanding the minimum amount of randomness that a probabilistic model of computation actually needs. A typical result shows that n truly random bits used by the model can be replaced by n pseudorandom ones, generated deterministically from m << n random bits, without significant difference in the behavior of the model. The deterministic function stretching the m random bits into n pseudorandom ones is called a pseudorandom generator, which is said to fool the ...
Pseudorandomness and average-case complexity via uniform reductions
In Proceedings of the 17th Annual IEEE Conference on Computational Complexity, 2002
Cited by 51 (7 self)

Abstract:
Impagliazzo and Wigderson (36th FOCS, 1998) gave the first construction of pseudorandom generators from a uniform complexity assumption on EXP (namely EXP ≠ BPP). Unlike results in the non-uniform setting, their result does not provide a continuous tradeoff between worst-case hardness and pseudorandomness, nor does it explicitly establish an average-case hardness result. In this paper:
◦ We obtain an optimal worst-case to average-case connection for EXP: if EXP ⊄ BPTIME(t(n)), then EXP has problems that cannot be solved on a fraction 1/2 + 1/t′(n) of the inputs by BPTIME(t′(n)) algorithms, for t′ = t^{Ω(1)}.
◦ We exhibit a PSPACE-complete self-correctible and downward self-reducible problem. This slightly simplifies and strengthens the proof of Impagliazzo and Wigderson, which used a #P-complete problem with these properties.
◦ We argue that the results of Impagliazzo and Wigderson, and the ones in this paper, cannot be proved via “black-box” uniform reductions.