Results 1–10 of 111
Probabilistic checking of proofs: a new characterization of NP
 Journal of the ACM
, 1998
"... Abstract. We give a new characterization of NP: the class NP contains exactly those languages L for which membership proofs (a proof that an input x is in L) can be verified probabilistically in polynomial time using logarithmic number of random bits and by reading sublogarithmic number of bits from ..."
Abstract

Cited by 368 (28 self)
 Add to MetaCart
Abstract. We give a new characterization of NP: the class NP contains exactly those languages L for which membership proofs (a proof that an input x is in L) can be verified probabilistically in polynomial time using a logarithmic number of random bits and reading a sublogarithmic number of bits from the proof. We discuss implications of this characterization; specifically, we show that approximating Clique and Independent Set, even in a very weak sense, is NP-hard.
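In standard PCP notation (our rendering; the abstract above states it in words), the characterization says a polynomial-time verifier with O(log n) random bits and o(log n) proof queries captures NP exactly:

\[ \mathrm{NP} \;=\; \mathrm{PCP}\bigl[\,O(\log n),\; o(\log n)\,\bigr]. \]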
Randomness is Linear in Space
 Journal of Computer and System Sciences
, 1993
"... We show that any randomized algorithm that runs in space S and time T and uses poly(S) random bits can be simulated using only O(S) random bits in space S and time T poly(S). A deterministic simulation in space S follows. Of independent interest is our main technical tool: a procedure which extracts ..."
Abstract

Cited by 225 (18 self)
 Add to MetaCart
We show that any randomized algorithm that runs in space S and time T and uses poly(S) random bits can be simulated using only O(S) random bits in space S and time T · poly(S). A deterministic simulation in space S follows. Of independent interest is our main technical tool: a procedure which extracts randomness from a defective random source using a small additional number of truly random bits.
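In complexity-class notation (our condensed rendering; the abstract phrases it algorithmically), the deterministic simulation yields

\[ \mathrm{BPSPACE}(S) \;\subseteq\; \mathrm{DSPACE}\bigl(O(S)\bigr). \]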
Zero Knowledge and the Chromatic Number
 Journal of Computer and System Sciences
, 1996
"... We present a new technique, inspired by zeroknowledge proof systems, for proving lower bounds on approximating the chromatic number of a graph. To illustrate this technique we present simple reductions from max3coloring and max3sat, showing that it is hard to approximate the chromatic number wi ..."
Abstract

Cited by 183 (8 self)
 Add to MetaCart
(Show Context)
We present a new technique, inspired by zero-knowledge proof systems, for proving lower bounds on approximating the chromatic number of a graph. To illustrate this technique we present simple reductions from max-3-coloring and max-3-sat, showing that it is hard to approximate the chromatic number within Ω(N^δ), for some δ > 0. We then apply our technique in conjunction with the probabilistically checkable proofs of Håstad, and show that it is hard to approximate the chromatic number to within Ω(N^{1-ε}) for any ε > 0, assuming NP ⊄ ZPP. Here, ZPP denotes the class of languages decidable by a randomized expected polynomial-time algorithm that makes no errors. Our result matches (up to low-order terms) the known gap for approximating the size of the largest independent set. Previous O(N^δ) gaps for approximating the chromatic number (such as those by Lund and Yannakakis, and by Fürer) did not match the gap for independent set, and do not extend...
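Condensed into one implication (our rendering, with N the number of vertices and χ(G) the chromatic number):

\[ \mathrm{NP} \not\subseteq \mathrm{ZPP} \;\Longrightarrow\; \chi(G) \text{ cannot be approximated within } \Omega\bigl(N^{1-\epsilon}\bigr) \text{ in polynomial time, for any fixed } \epsilon > 0. \]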
Pseudorandom generators without the XOR Lemma
, 1998
"... Madhu Sudan y Luca Trevisan z Salil Vadhan x Abstract Impagliazzo and Wigderson [IW97] have recently shown that if there exists a decision problem solvable in time 2 O(n) and having circuit complexity 2 n) (for all but finitely many n) then P = BPP. This result is a culmination of a serie ..."
Abstract

Cited by 128 (20 self)
 Add to MetaCart
By Madhu Sudan, Luca Trevisan, and Salil Vadhan. Impagliazzo and Wigderson [IW97] have recently shown that if there exists a decision problem solvable in time 2^{O(n)} and having circuit complexity 2^{Ω(n)} (for all but finitely many n) then P = BPP. This result is a culmination of a series of works showing connections between the existence of hard predicates and the existence of good pseudorandom generators. The construction of Impagliazzo and Wigderson goes through three phases of "hardness amplification" (a multivariate polynomial encoding, a first derandomized XOR Lemma, and a second derandomized XOR Lemma) that are composed with the Nisan–Wigderson [NW94] generator. In this paper we present two different approaches to proving the main result of Impagliazzo and Wigderson. In developing each approach, we introduce new techniques and prove new results that could be useful in future improvements and/or applications of hardness–randomness tradeoffs. Our first result is that when (a modified version of) the Nisan–Wigderson generator construction is applied with a "mildly" hard predicate, the result is a generator that produces a distribution indistinguishable from having large min-entropy. An extractor can then be used to produce a distribution computationally indistinguishable from uniform. This is the first construction of a pseudorandom generator that works with a mildly hard predicate without doing hardness amplification. We then show that in the Impagliazzo–Wigderson construction only the first hardness-amplification phase (encoding with a multivariate polynomial) is necessary, since it already gives the required average-case hardness. We prove this result by (i) establishing a connection between the hardness-amplification problem and a list-decoding...
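The Impagliazzo–Wigderson theorem reproved here, in symbols (our rendering of the abstract's first sentence):

\[ \exists\, L \in \mathrm{DTIME}\bigl(2^{O(n)}\bigr) \text{ requiring circuits of size } 2^{\Omega(n)} \;\Longrightarrow\; \mathrm{P} = \mathrm{BPP}. \]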
Modern cryptography, probabilistic proofs and pseudorandomness, volume 17 of Algorithms and Combinatorics
, 1999
"... all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that new copies bear this notice and the full citation on the first page. Abstracting with credit is permitted. IIPreface You can start by put ..."
Abstract

Cited by 124 (11 self)
 Add to MetaCart
Preface. You can start by putting the do-not-disturb sign. — Cay, in Desert Hearts (1985). The interplay between randomness and computation is one of the most fascinating scientific phenomena uncovered in the last couple of decades. This interplay is at the heart of modern cryptography and plays a fundamental role in complexity theory at large. Specifically, the interplay of randomness and computation is pivotal to several intriguing notions of probabilistic proof systems and is the focal point of the computational approach to randomness. This book provides an introduction to these three somewhat interwoven domains (i.e., cryptography, proofs, and randomness). Modern Cryptography. Whereas classical cryptography was confined to...
Chernoff–Hoeffding bounds for applications with limited independence
 SIAM Journal on Discrete Mathematics
, 1995
"... ..."
(Show Context)
Lossless condensers, unbalanced expanders, and extractors
 Proceedings of the 33rd Annual ACM Symposium on Theory of Computing
, 2001
"... Abstract Trevisan showed that many pseudorandom generator constructions give rise to constructionsof explicit extractors. We show how to use such constructions to obtain explicit lossless condensers. A lossless condenser is a probabilistic map using only O(log n) additional random bitsthat maps n bi ..."
Abstract

Cited by 90 (19 self)
 Add to MetaCart
(Show Context)
Trevisan showed that many pseudorandom generator constructions give rise to constructions of explicit extractors. We show how to use such constructions to obtain explicit lossless condensers. A lossless condenser is a probabilistic map using only O(log n) additional random bits that maps n-bit strings to poly(log K)-bit strings, such that any source with support size K is mapped almost injectively to the smaller domain. Our construction remains the best lossless condenser to date. By composing our condenser with previous extractors, we obtain new, improved extractors. For small enough min-entropies our extractors can output all of the randomness with only O(log n) bits. We also obtain a new disperser that works for every entropy loss, uses an O(log n)-bit seed, and has only O(log n) entropy loss. This is the best disperser construction to date, and yields other applications. Finally, our lossless condenser can be viewed as an unbalanced...
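One standard way to formalize the lossless-condenser guarantee sketched above (our notation; the error parameter ε in particular is not named in the abstract and is illustrative):

\[ C : \{0,1\}^n \times \{0,1\}^{O(\log n)} \to \{0,1\}^{\mathrm{poly}(\log K)}, \qquad \bigl|\mathrm{Supp}\bigl(C(X, U_d)\bigr)\bigr| \;\ge\; (1-\epsilon)\,|\mathrm{Supp}(X)| \]

for every source X with support size at most K, i.e., essentially no min-entropy is lost in the compression.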
Expanders that Beat the Eigenvalue Bound: Explicit Construction and Applications
 Combinatorica
, 1993
"... For every n and 0 ! ffi ! 1, we construct graphs on n nodes such that every two sets of size n ffi share an edge, having essentially optimal maximum degree n 1\Gammaffi+o(1) . Using known and new reductions from these graphs, we explicitly construct: 1. A k round sorting algorithm using n 1+1=k ..."
Abstract

Cited by 90 (26 self)
 Add to MetaCart
For every n and 0 < δ < 1, we construct graphs on n nodes such that every two sets of size n^δ share an edge, having essentially optimal maximum degree n^{1-δ+o(1)}. Using known and new reductions from these graphs, we explicitly construct:
1. A k-round sorting algorithm using n^{1+1/k+o(1)} comparisons.
2. A k-round selection algorithm using n^{1+1/(2^k-1)+o(1)} comparisons.
3. A depth-2 superconcentrator of size n^{1+o(1)}.
4. A depth-k wide-sense nonblocking generalized connector of size n^{1+1/k+o(1)}.
All of these results improve on previous constructions by factors of n^{Ω(1)}, and are optimal to within factors of n^{o(1)}. These results are based on an improvement to the extractor construction of Nisan & Zuckerman: our algorithm extracts an asymptotically optimal number of random bits from a defective random source using a small additional number of truly random bits.
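The core combinatorial property, written out (our rendering of the abstract's first sentence): a graph G = (V, E) with |V| = n such that

\[ \Delta(G) \le n^{1-\delta+o(1)} \quad\text{and}\quad \forall\, S, T \subseteq V:\; |S|, |T| \ge n^{\delta} \;\Rightarrow\; E(S, T) \neq \emptyset. \]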
Extracting Randomness: A Survey and New Constructions
, 1999
"... this paper we do two things. First, we survey extractors and dispersers: what they are, how they can be designed, and some of their applications. The work described in the survey is due to a long list of research papers by various authors##most notably by David Zuckerman. Then, we present a new tool ..."
Abstract

Cited by 88 (4 self)
 Add to MetaCart
In this paper we do two things. First, we survey extractors and dispersers: what they are, how they can be designed, and some of their applications. The work described in the survey is due to a long list of research papers by various authors, most notably by David Zuckerman. Then, we present a new tool for constructing explicit extractors and give two new constructions that greatly improve upon previous results. The new tool we devise, a "merger," is a function that accepts d strings, one of which is uniformly distributed, and outputs a single string that is guaranteed to be uniformly distributed. We show how to build good explicit mergers, and how mergers can be used to build better extractors. Using this, we present two new constructions. The first construction succeeds in extracting all of the randomness from any somewhat-random source. This improves upon previous extractors that extract only some of the randomness from somewhat-random sources with "enough" randomness. The amount of truly random bits used by this extractor, however, is not optimal. The second extractor we build extracts only some of the randomness and works only for sources with enough randomness, but uses a near-optimal amount of truly random bits. Extractors and dispersers have many applications in "removing randomness" in various settings and in making randomized constructions explicit. We survey some of these applications and note whenever our new constructions yield better results; e.g., plugging our new extractors into a previous construction, we achieve the first explicit N-superconcentrators of linear size and poly(log log N) depth.
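The merger guarantee described above, in symbols (our notation; the output length m and statistical closeness δ are our hedged reading of "guaranteed to be uniformly distributed"):

\[ M : \bigl(\{0,1\}^n\bigr)^d \to \{0,1\}^m, \qquad X_i \sim U_n \text{ for some } i \;\Longrightarrow\; M(X_1, \dots, X_d) \approx_{\delta} U_m. \]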