Results 1–10 of 26
Pseudorandom generators for space-bounded computation
Combinatorica, 1992
Abstract

Cited by 184 (10 self)
Pseudorandom generators are constructed which convert O(S log R) truly random bits to R bits that appear random to any algorithm that runs in SPACE(S). In particular, any randomized polynomial-time algorithm that runs in space S can be simulated using only O(S log n) random bits. An application of these generators is an explicit construction of universal traversal sequences (for arbitrary graphs) of length n^O(log n). The generators constructed are technically stronger than just appearing random to space-bounded machines, and have several other applications. In particular, applications are given for "deterministic amplification" (i.e., reducing the probability of error of randomized algorithms), as well as generalizations of it.
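A rough illustration of the recursive structure behind such generators (a simplified sketch, not Nisan's actual construction; the hash family and parameters below are illustrative assumptions): each level of hashing doubles the output length, so k levels stretch one S-bit seed block plus k hash descriptions into 2^k output blocks, matching the O(S log R) seed length for R output bits.

```python
import random

def random_hash(p, s_bits):
    # One pairwise-independent hash h(x) = ((a*x + b) mod p) mod 2^s,
    # described by O(s_bits) truly random bits.
    a, b = random.randrange(1, p), random.randrange(p)
    return lambda x: ((a * x + b) % p) % (1 << s_bits)

def prg(x, hs):
    """Recursive doubling: G_0(x) = x and
    G_i(x) = G_{i-1}(x) || G_{i-1}(h_i(x)),
    so len(hs) levels expand one seed block into 2^len(hs) blocks."""
    if not hs:
        return [x]
    return prg(x, hs[:-1]) + prg(hs[-1](x), hs[:-1])
```

With S = 8 and k = 4, the seed costs 8 + 4·O(8) bits and yields 16 blocks of 8 bits each.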
How to Recycle Random Bits
1989
Abstract

Cited by 179 (11 self)
We show that modified versions of the linear congruential generator and the shift register generator are provably good for amplifying the correctness of a probabilistic algorithm. More precisely, if r random bits are needed for a BPP algorithm to be correct with probability at least 2/3, then O(r + k²) bits are needed to improve this probability to 1 − 2^(−k). We also present a different pseudorandom generator that is optimal, up to a constant factor, in this regard: it uses only O(r + k) bits to improve the probability to 1 − 2^(−k). This generator is based on random walks on expanders. Our results do not depend on any unproven assumptions. Next we show that our modified versions of the shift register and linear congruential generators can be used to sample from distributions using, in the limit, the information-theoretic lower bound on random bits.
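A minimal sketch of the expander-walk idea (the 3-regular graph on Z_p with steps x → x+1, x−1, x^(−1) is a standard explicit expander; the toy "algorithm", its error density, and all parameters below are illustrative assumptions, not the paper's construction): instead of k independent r-bit runs, take one random start plus a cheap walk, and vote.

```python
import random

def neighbors(x, p):
    # 3-regular expander on Z_p: steps x+1, x-1, and x^(-1) (with 0 -> 0)
    return [(x + 1) % p, (x - 1) % p, pow(x, p - 2, p) if x else 0]

def amplified(algo, p, steps):
    """Run `algo` on each label of a random walk and take a majority
    vote: log2(p) bits for the start plus ~log2(3) bits per step,
    i.e. O(r + k) random bits rather than k independent r-bit runs."""
    x = random.randrange(p)
    votes = [algo(x)]
    for _ in range(steps):
        x = random.choice(neighbors(x, p))
        votes.append(algo(x))
    return sum(votes) > len(votes) / 2
```

For a toy predicate that errs on roughly a third of the labels, the 21-vote majority is correct far more often than a single run.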
Simulating BPP Using a General Weak Random Source
ALGORITHMICA, 1996
Abstract

Cited by 105 (18 self)
We show how to simulate BPP and approximation algorithms in polynomial time using the output from a δ-source. A δ-source is a weak random source that is asked only once for R bits, and must output an R-bit string according to some distribution that places probability no more than 2^(−δR) on any particular string. We also give an application to the unapproximability of Max Clique.
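In this terminology, the min-entropy of a distribution is −log2 of its heaviest point, and the δ-source condition caps every string's mass at 2^(−δR). A small sanity-check sketch of the two definitions:

```python
import math

def min_entropy(dist):
    """H_inf(X) = -log2(max_x Pr[X = x]) for a dict of probabilities."""
    return -math.log2(max(dist.values()))

def is_delta_source(dist, delta, R):
    # The defining condition: no R-bit string has probability above 2^(-delta*R).
    return max(dist.values()) <= 2 ** (-delta * R)
```

The uniform distribution on R bits is a 1-source; putting mass 1/2 on one string drops the min-entropy to 1 bit.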
Extracting Randomness: A Survey and New Constructions
1999
Abstract

Cited by 89 (5 self)
In this paper we do two things. First, we survey extractors and dispersers: what they are, how they can be designed, and some of their applications. The work described in the survey is due to a long list of research papers by various authors, most notably by David Zuckerman. Then, we present a new tool for constructing explicit extractors and give two new constructions that greatly improve upon previous results. The new tool we devise, a "merger," is a function that accepts d strings, one of which is uniformly distributed, and outputs a single string that is guaranteed to be uniformly distributed. We show how to build good explicit mergers, and how mergers can be used to build better extractors. Using this, we present two new constructions. The first construction succeeds in extracting all of the randomness from any somewhat-random source. This improves upon previous extractors that extract only some of the randomness from somewhat-random sources with "enough" randomness. The amount of truly random bits used by this extractor, however, is not optimal. The second extractor we build extracts only some of the randomness and works only for sources with enough randomness, but uses a near-optimal amount of truly random bits. Extractors and dispersers have many applications in "removing randomness" in various settings and in making randomized constructions explicit. We survey some of these applications and note whenever our new constructions yield better results; e.g., plugging our new extractors into a previous construction, we achieve the first explicit N-superconcentrators of linear size and polyloglog(N) depth.
Extracting all the Randomness and Reducing the Error in Trevisan's Extractors
In Proceedings of the 31st Annual ACM Symposium on Theory of Computing, 1999
Abstract

Cited by 78 (16 self)
We give explicit constructions of extractors which work for a source of any min-entropy on strings of length n. These extractors can extract any constant fraction of the min-entropy using O(log² n) additional random bits, and can extract all the min-entropy using O(log³ n) additional random bits. Both of these constructions use fewer truly random bits than any previous construction which works for all min-entropies and extracts a constant fraction of the min-entropy. We then improve our second construction and show that we can reduce the entropy loss to 2 log(1/ε) + O(1) bits, while still using O(log³ n) truly random bits (where entropy loss is defined as [(source min-entropy) + (# truly random bits used) − (# output bits)], and ε is the statistical difference from uniform achieved). This entropy loss is optimal up to a constant additive term.
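The bracketed definition is easy to state as a formula; a small sketch, with the 2·log2(1/ε) lower bound (due to Radhakrishnan and Ta-Shma, which the optimality claim refers to) alongside it:

```python
import math

def entropy_loss(k, d, m):
    # loss = (source min-entropy) + (truly random bits used) - (output bits)
    return k + d - m

def optimal_loss(eps):
    """Best achievable entropy loss, up to an additive constant: 2*log2(1/eps)."""
    return 2 * math.log2(1 / eps)
```

E.g., a source of min-entropy 100 plus a 30-bit seed yielding 120 output bits loses 10 bits of entropy; for ε = 2^(−10) no extractor can lose much less than 20 bits.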
Computing with Very Weak Random Sources
1994
Abstract

Cited by 71 (7 self)
For any fixed δ > 0, we show how to simulate RP algorithms in time n^O(log n) using the output of a δ-source with min-entropy R^δ. Such a weak random source is asked once for R bits; it outputs an R-bit string such that any string has probability at most 2^(−R^δ). If δ > 1 − 1/(k + 1), our BPP simulations take time n^O(log^(k) n) (log^(k) is the logarithm iterated k times). We also give a polynomial-time BPP simulation using Chor–Goldreich sources of min-entropy R^Ω(1), which is optimal. We present applications to time-space tradeoffs, expander constructions, and the hardness of approximation. Also of interest is our randomness-efficient Leftover Hash Lemma, found independently by Goldreich & Wigderson.
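The Leftover Hash Lemma mentioned here says, roughly, that applying a randomly drawn universal hash to a source of min-entropy k yields an output whose average distance from uniform is at most (1/2)·sqrt(2^(m−k)) for an m-bit output. A toy numerical sketch (the mod-257 hash family and the particular 6-bit source are illustrative assumptions, not the paper's randomness-efficient version):

```python
import random

P = 257  # prime modulus for an (approximately) universal hash family

def hashed_dist(a, b, source, m):
    # Distribution of h(x) = ((a*x + b) mod P) mod m for x uniform on `source`.
    counts = [0] * m
    for x in source:
        counts[((a * x + b) % P) % m] += 1
    return [c / len(source) for c in counts]

def stat_dist(p, q):
    return sum(abs(pi - qi) for pi, qi in zip(p, q)) / 2

def avg_distance(source, m, trials):
    """Average statistical distance of h(X) from uniform over random seeds;
    the Leftover Hash Lemma bounds this by (1/2) * sqrt(m / 2^k)."""
    total = 0.0
    for _ in range(trials):
        a, b = random.randrange(1, P), random.randrange(P)
        total += stat_dist(hashed_dist(a, b, source, m), [1 / m] * m)
    return total / trials
```

For a source of min-entropy 6 hashed to 2 bits, the lemma's bound is 0.5·sqrt(4/64) = 0.125, and the empirical average sits comfortably below it.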
Extractors with weak random seeds
In Proceedings of the 37th Annual ACM Symposium on Theory of Computing, 2005
Abstract

Cited by 62 (6 self)
We show how to extract random bits from two or more independent weak random sources in cases where only one source is of linear min-entropy and all other sources are of logarithmic min-entropy. Our main results are as follows:

1. A long line of research, starting with Nisan and Zuckerman [15], gives explicit constructions of seeded extractors, that is, extractors that use a short seed of truly random bits to extract randomness from a weak random source. For every such extractor E, with seed of length d, we construct an extractor E′, with seed of length d′ = O(d), that achieves the same parameters as E but only requires the seed to be of min-entropy larger than (1/2 + δ)·d′ (rather than fully random), where δ is an arbitrarily small constant.

2. Fundamental results of Chor and Goldreich, and of Vazirani [6, 22] show how to extract Ω(n) random bits from two (independent) sources of length n and min-entropy larger than (1/2 + δ)·n, where δ is an arbitrarily small constant. We show how to extract Ω(n) random bits (with optimal probability of error) when only one source is of min-entropy (1/2 + δ)·n and the other source is of logarithmic min-entropy.

3. A recent breakthrough of Barak, Impagliazzo and Wigderson [4] shows how to extract Ω(n) random bits from a constant number of (independent) sources of length n and min-entropy larger than δn, where δ is an arbitrarily small constant. We show how to extract Ω(n) random bits (with optimal probability of error) when only one source is of min-entropy δn and all other (constant number of) sources are of logarithmic min-entropy.

4. A very recent result of Barak, Kindler, Shaltiel, Sudakov and Wigderson [5] shows how to extract a constant number of random bits from three (independent) sources of length n and min-entropy larger than δn, where δ is an arbitrarily small constant. We show how to extract Ω(n) random bits, with sub-constant probability of error, from one source of min-entropy δn and two sources of logarithmic min-entropy.
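The Chor–Goldreich/Vazirani two-source results cited in item 2 are classically witnessed by the inner-product extractor IP(x, y) = ⟨x, y⟩ mod 2. A minimal sketch checking only its balance on fully uniform inputs (a much weaker statement than the min-entropy guarantee):

```python
def inner_product_bit(x, y):
    """<x, y> mod 2 for two n-bit strings encoded as integers."""
    return bin(x & y).count("1") % 2
```

Over all pairs of 4-bit inputs the bit is balanced up to the all-zero row (for each x ≠ 0, exactly half the y's give 1); the theorem extends this to any two independent sources of min-entropy above n/2.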
Bucket Hashing and its Application to Fast Message Authentication
1995
Abstract

Cited by 51 (4 self)
We introduce a new technique for constructing a family of universal hash functions.
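The paper's bucket-hashing construction is not reproduced here, but the property it targets, universality, is cheap to state in code: a family is universal if any fixed pair of distinct inputs collides with probability about 1/m over the draw of the hash. The mod-p family below is the standard textbook example, not the paper's construction:

```python
import random

P = (1 << 61) - 1  # a Mersenne prime modulus

def make_hash(m):
    """Draw h(x) = ((a*x + b) mod P) mod m from a universal family."""
    a, b = random.randrange(1, P), random.randrange(P)
    return lambda x: ((a * x + b) % P) % m
```

Sampling many hashes and counting collisions of one fixed pair empirically recovers the ~1/m collision rate; bucket hashing trades a larger description for much faster evaluation.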
Extracting randomness using few independent sources
In Proceedings of the 45th Annual IEEE Symposium on Foundations of Computer Science, 2004
Abstract

Cited by 48 (6 self)
In this work we give the first deterministic extractors from a constant number of weak sources whose entropy rate is less than 1/2. Specifically, for every δ > 0 we give an explicit construction for extracting randomness from a constant (depending polynomially on 1/δ) number of distributions over {0,1}^n, each having min-entropy δn. These extractors output n bits, which are 2^(−n)-close to uniform. This construction uses several results from additive number theory, and in particular a recent one by Bourgain, Katz and Tao [BKT03] and one by Konyagin [Kon03]. We also consider the related problem of constructing randomness dispersers. For any constant output length m, our dispersers use a constant number of identical distributions, each with min-entropy Ω(log n), and output every possible m-bit string with positive probability. The main tool we use is a variant of the "stepping-up lemma" used in establishing lower bounds.
Weak Random Sources, Hitting Sets, and BPP Simulations
1998
Abstract

Cited by 40 (5 self)
We show how to simulate any BPP algorithm in polynomial time using a weak random source of r bits and min-entropy r^γ for any γ > 0. This follows from a more general result about sampling with weak random sources. Our result matches an information-theoretic lower bound and solves a question that has been open for some years. The previous best results were a polynomial-time simulation of RP [Saks, Srinivasan and Zhou 1995] and a quasi-polynomial-time simulation of BPP [Ta-Shma 1996]. Departing significantly from previous related works, we do not use extractors; instead, we use the OR-disperser of [Saks, Srinivasan, and Zhou 1995] in combination with a tricky use of hitting sets borrowed from [Andreev, Clementi, and Rolim 1996]. AMS Subject Classification: 68Q10, 11K45. Key Words and Phrases: Derandomization, Imperfect Sources of Randomness, Hitting Sets, Randomized Computations, Expander Graphs.