Results 11–20 of 45
Deterministic Extractors for Small-Space Sources
, 2006
Abstract

Cited by 29 (3 self)
We give polynomial-time, deterministic randomness extractors for sources generated in small space, where we model space-s sources on {0,1}^n as sources generated by width-2^s branching programs: for every constant δ > 0, we can extract .99δn bits that are exponentially close to uniform (in variation distance) from space-s sources of min-entropy δn, where s = Ω(n). In addition, assuming an efficient deterministic algorithm for finding large primes, there is a constant η > 0 such that for any ζ > n^−η, we can extract m = (δ − ζ)n bits that are exponentially close to uniform from space-s sources with min-entropy δn, where s = Ω(ζ^3 n). Previously, nothing was known for δ ≤ 1/2, even for space 0. Our results are obtained by a reduction to a new class of sources that we call independent-symbol sources, which generalize both the well-studied models of independent sources and symbol-fixing sources. These sources consist of a string of n independent symbols over a d-symbol alphabet with min-entropy k. We give deterministic extractors for such sources when k is as small as polylog(n), for small enough d.
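The independent-symbol model above generalizes symbol-fixing sources. As a minimal illustration (not the paper's construction, and with hypothetical function names): for a symbol-fixing source over a d-symbol alphabet, the sum of the symbols mod d is already a perfect deterministic extractor of one symbol, because a single uniform symbol makes the whole sum uniform.

```python
from itertools import product
from fractions import Fraction

def sum_mod_d_extractor(symbols, d):
    """Toy extractor: output the sum of the source's symbols mod d."""
    return sum(symbols) % d

def output_distribution(fixed_pattern, d):
    """Exact output distribution over a symbol-fixing source: positions
    holding None are uniform over {0, ..., d-1}, the rest are fixed."""
    free = [i for i, v in enumerate(fixed_pattern) if v is None]
    dist = {y: Fraction(0) for y in range(d)}
    p = Fraction(1, d ** len(free))
    for assignment in product(range(d), repeat=len(free)):
        symbols = list(fixed_pattern)
        for i, v in zip(free, assignment):
            symbols[i] = v
        dist[sum_mod_d_extractor(symbols, d)] += p
    return dist

# One uniform symbol suffices: the output is exactly uniform over Z_d.
dist = output_distribution([2, None, 1, None], d=3)
```

Handling general independent-symbol sources, where each symbol may be biased, is exactly the harder problem the paper addresses.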
On Deterministic Approximation of DNF
 In Proceedings of STOC'91
, 1993
Abstract

Cited by 28 (3 self)
We develop efficient deterministic algorithms for approximating the fraction of truth assignments that satisfy a disjunctive normal form formula. Although the algorithms themselves are deterministic, their analysis is probabilistic and uses the notion of limited independence between random variables. 1 Introduction. Throughout this paper, let F denote a formula in disjunctive normal form (DNF) on n variables with m clauses of length at most t, and let Pr[F] denote the probability that a random, independent and...
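For contrast with the deterministic approximation algorithms described above, here is the exponential-time brute-force computation of Pr[F] that those algorithms avoid (purely illustrative; the clause encoding is an assumption of this sketch, not taken from the paper):

```python
from itertools import product

def dnf_fraction(clauses, n):
    """Exact fraction of assignments satisfying a DNF formula.
    Each clause is a list of nonzero ints: +i means x_i, -i means NOT x_i.
    Runs in time 2^n * |F|, so it is only usable for tiny n."""
    count = 0
    for bits in product((0, 1), repeat=n):
        if any(all(bits[abs(l) - 1] == (1 if l > 0 else 0) for l in clause)
               for clause in clauses):
            count += 1
    return count / 2 ** n

# F = (x1 AND x2) OR (NOT x3): Pr[F] = 1/4 + 1/2 - 1/8 = 5/8
p = dnf_fraction([[1, 2], [-3]], n=3)
```

The paper's point is that limited (k-wise) independence lets a deterministic algorithm approximate this quantity in polynomial time.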
Products and Help Bits in Decision Trees
, 1994
Abstract

Cited by 28 (1 self)
We investigate two problems concerning the complexity of evaluating a function f at a k-tuple of unrelated inputs by k parallel decision tree algorithms. In the product problem, for some fixed depth bound d, we seek to maximize the fraction of input k-tuples for which all k decision trees are correct. Assume that for a single input to f, the best decision tree algorithm of depth d is correct on a fraction p of inputs. We prove that the maximum fraction of k-tuples on which k depth-d algorithms are all correct is at most p^k, which matches the trivial lower bound. We show that if we replace the depth-d restriction by "expected depth d", then this result fails. In the help-bit problem, we are permitted to ask k − 1 arbitrary binary questions about the k-tuple of inputs. For each possible (k − 1)-tuple of answers to these queries we will have a k-tuple of decision trees which are supposed to correctly compute all functions on k-tuples that are consistent with the part...
Improved Randomness Extraction from Two Independent Sources
 In Proc. of 8th RANDOM
, 2004
Abstract

Cited by 26 (6 self)
Given two independent weak random sources X, Y with the same length ℓ and min-entropies bX, bY whose sum is greater than ℓ + Ω(polylog(ℓ/ε)), we construct a deterministic two-source extractor (aka "blender") that extracts max(bX, bY) + (bX + bY − ℓ − 4 log(1/ε)) bits which are ε-close to uniform. In contrast, the best previously published construction [4] extracted at most (bX + bY − ℓ − 2 log(1/ε))/2 bits. Our main technical tool is a construction of a strong two-source extractor that extracts (bX + bY − ℓ − 2 log(1/ε)) bits which are ε-close to being uniform and independent of one of the sources (aka "strong blender"), so that they can later be reused as a seed to a seeded extractor. Our strong two-source extractor construction improves the best previously published construction of such strong blenders [7] by a factor of 2, applies to more sources X and Y, and is considerably simpler than the latter. Our methodology also unifies several of the previous two-source extractor constructions from the literature.
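A classical point of comparison for the two-source extractors above is the Chor-Goldreich one-bit inner-product blender. The sketch below (illustrative only, not this paper's construction) computes its exact bias when both sources are fully uniform; the only bad event is x = 0^n, which forces the output to 0.

```python
from itertools import product

def inner_product_bit(x, y):
    """One-bit 'blender': inner product of the two source strings mod 2."""
    return sum(a & b for a, b in zip(x, y)) % 2

def ip_bias(n):
    """Exact bias |Pr[bit = 0] - 1/2| when X and Y are both uniform."""
    zero = sum(1 for x in product((0, 1), repeat=n)
                 for y in product((0, 1), repeat=n)
                 if inner_product_bit(x, y) == 0)
    return zero / 4 ** n - 0.5

# For uniform sources the bias is exactly 2^-(n+1).
b = ip_bias(4)
```

The interesting regime, of course, is when X and Y are only weakly random; there the bias depends on their min-entropies, which is what blender constructions quantify.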
On extracting private randomness over a public channel
 In Proc. RANDOM ’03
, 2003
Abstract

Cited by 22 (5 self)
We introduce the notion of a super-strong extractor. Given two independent weak random sources X, Y, such an extractor EXT(·, ·) has the property that EXT(X, Y) is statistically random even if one is given Y. Namely, 〈Y, EXT(X, Y)〉 ≈ 〈Y, R〉. Super-strong extractors generalize the notion of strong extractors [16], which assume that Y is truly random, and extractors from two weak random sources [26, 7], which only assure that EXT(X, Y) ≈ R. We show that super-strong extractors have many natural applications to the design of cryptographic systems in a setting where different parties have independent weak sources of randomness, but have to communicate over an insecure channel. For example, they allow one party to "help" another party extract private randomness: the "helper" simply sends Y, and the "client" gets private randomness EXT(X, Y). In particular, this allows two parties to derive a nearly random key after initial agreement on only a weak shared key, without using ideal local randomness. We show that optimal super-strong extractors exist, which are capable of extracting all the randomness from X, as long as Y has a logarithmic amount of min-entropy. This generalizes a similar result for strong extractors, and improves upon previously known bounds [7] for the weaker problem of randomness extraction from two independent random sources. We also give explicit super-strong extractors which work provided the ...
Approximating Probability Distributions Using Small Sample Spaces
 Combinatorica
, 1995
Abstract

Cited by 22 (0 self)
We formulate the notion of a "good approximation" to a probability distribution over a finite abelian group. The approximate distribution is characterized by a parameter ε, the quality of the approximation, which is a bound on the difference between corresponding Fourier coefficients of the two distributions. It is also required that the sample space of the approximate distribution be of size polynomial in the representation length of the group elements as well as 1/ε. Such approximations are useful in reducing or eliminating the use of randomness in randomized algorithms. We demonstrate the existence of such good approximations to arbitrary distributions. In the case of n random variables distributed uniformly and independently over the range {0, …, d − 1}, we provide an efficient construction of a good approximation. The constructed approximation has the property that any linear combination of the random variables (modulo d) has essentially the same behavior under the ...
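The quality measure ε above can be computed directly for small examples. The sketch below (function names are my own, and it treats only the group (Z_d)^n) evaluates the largest nonzero-character Fourier coefficient of a candidate sample space, i.e. its worst-case deviation from uniform as measured character by character:

```python
from itertools import product
import cmath

def fourier_bias(sample_space, d, n):
    """Largest |Fourier coefficient| over the nonzero characters of the
    uniform distribution on sample_space, viewed as a distribution on
    (Z_d)^n. The exactly uniform distribution scores 0 on every one."""
    omega = cmath.exp(2j * cmath.pi / d)
    worst = 0.0
    for alpha in product(range(d), repeat=n):
        if not any(alpha):
            continue  # skip the trivial character
        coeff = sum(omega ** sum(a * x for a, x in zip(alpha, pt))
                    for pt in sample_space) / len(sample_space)
        worst = max(worst, abs(coeff))
    return worst

# The full cube is exactly uniform: every nonzero coefficient vanishes.
cube = list(product(range(2), repeat=3))
eps_uniform = fourier_bias(cube, d=2, n=3)

# The even-parity subspace fools every character except (1,1,1),
# where its coefficient is 1: a maximally bad approximation there.
even = [p for p in cube if sum(p) % 2 == 0]
eps_parity = fourier_bias(even, d=2, n=3)
```

The paper's contribution is constructing polynomial-size sample spaces that keep this quantity small for every nonzero character simultaneously.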
Constructions of Near-Optimal Extractors Using Pseudo-Random Generators
 Electronic Colloquium on Computational Complexity
, 1998
Abstract

Cited by 20 (3 self)
We introduce a new approach to constructing extractors, combinatorial objects akin to expander graphs that have several applications. Our approach is based on error-correcting codes and on the Nisan-Wigderson pseudorandom generator. A straightforward application of our approach yields a construction that is simple to describe and analyze, does not use any of the standard techniques used in related results, and improves or subsumes almost all the previous constructions. 1 Introduction. Informally defined, an extractor is a function that extracts randomness from a weakly random distribution. Explicit constructions of extractors have several applications and are typically very hard to achieve. In this paper we introduce a new approach to the explicit construction of extractors. Our approach yields a construction that improves most of the known results, and that is optimal for certain parameters. Furthermore, our construction is simple and uses techniques that were never used in this field...
Three XOR-Lemmas: An Exposition
 Electronic Colloquium on Computational Complexity (ECCC
, 1995
Abstract

Cited by 19 (2 self)
Abstract. We provide an exposition of three lemmas that relate general properties of distributions over bit strings to the exclusive-or (XOR) of values of certain bit locations. The first XOR-Lemma, commonly attributed to Umesh Vazirani (1986), relates the statistical distance of a distribution from the uniform distribution over bit strings to the maximum bias of the XOR of certain bit positions. The second XOR-Lemma, due to Umesh and Vijay Vazirani (19th STOC, 1987), is a computational analogue of the first. It relates the pseudorandomness of a distribution to the difficulty of predicting the XOR of bits in particular or random positions. The third lemma, due to Goldreich and Levin (21st STOC, 1989), relates the difficulty of retrieving a string to the unpredictability of the XOR of random bit positions. The most notable XOR lemma, the so-called Yao XOR Lemma, is not discussed here. We focus on the proofs of the aforementioned three lemmas. Our exposition deviates from the original proofs, yielding proofs that are believed to be simpler, of wider applicability, and establishing somewhat stronger quantitative results. Credit for these improved proofs is due to several researchers.
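The first XOR-Lemma can be checked numerically on toy distributions. The sketch below (helper names are mine, using the commonly quoted form SD ≤ 2^(n/2) · max-bias) computes both sides for the distribution that is uniform over the even-parity 3-bit strings:

```python
from itertools import product

def stat_distance(dist, n):
    """Variation distance between dist (a dict point -> prob) and the
    uniform distribution on {0,1}^n."""
    u = 1.0 / 2 ** n
    return sum(abs(dist.get(x, 0.0) - u)
               for x in product((0, 1), repeat=n)) / 2

def max_xor_bias(dist, n):
    """Max over nonempty index sets S of |Pr[XOR of bits in S = 0] - 1/2|."""
    best = 0.0
    for S in product((0, 1), repeat=n):
        if not any(S):
            continue
        p0 = sum(p for x, p in dist.items()
                 if sum(s & b for s, b in zip(S, x)) % 2 == 0)
        best = max(best, abs(p0 - 0.5))
    return best

n = 3
dist = {x: 0.25 for x in product((0, 1), repeat=n) if sum(x) % 2 == 0}
sd = stat_distance(dist, n)   # 1/2
mb = max_xor_bias(dist, n)    # 1/2, witnessed by S = {1,2,3}
```

Here SD = 1/2 and 2^(3/2) · 1/2 ≈ 1.41, so the bound holds with room to spare; the parity character is the one that exposes this distribution.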
Biased random walks
 In Proceedings of the TwentyFourth Annual ACM Symposium on the Theory of Computing
, 1992
Abstract

Cited by 17 (0 self)
How much can an imperfect source of randomness affect an algorithm? We examine several simple questions of this type concerning the long-term behavior of a random walk on a finite graph. In our setup, at each step of the random walk a "controller" can, with a certain small probability, fix the next step, thus introducing a bias. We analyze the extent to which the bias can affect the limit behavior of the walk. The controller is assumed to associate a real, nonnegative "benefit" with each state, and to strive to maximize the long-term expected benefit. We derive tight bounds on the maximum of this objective function over all controller strategies, and present polynomial-time algorithms for computing the optimal controller strategy.
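The controller model above is easy to simulate. The sketch below (a toy greedy strategy on a cycle, not the paper's optimal one; all names are mine) computes the limiting distribution of the biased walk and shows how even a small controller probability shifts mass toward a single high-benefit state:

```python
def stationary(P, iters=5000):
    """Power-iterate a row-stochastic matrix from the uniform start."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def biased_cycle_walk(n, eps, target):
    """Random walk on an n-cycle in which, with probability eps, a greedy
    controller fixes the next step to move toward `target`; otherwise a
    uniformly random neighbor is taken. n odd keeps the chain aperiodic."""
    def dist_to_target(u):
        return min((u - target) % n, (target - u) % n)
    P = [[0.0] * n for _ in range(n)]
    for v in range(n):
        left, right = (v - 1) % n, (v + 1) % n
        steer = left if dist_to_target(left) <= dist_to_target(right) else right
        P[v][left] += (1 - eps) * 0.5
        P[v][right] += (1 - eps) * 0.5
        P[v][steer] += eps
    return stationary(P)

# With benefit concentrated on state 0, greedy steering already lifts
# that state's long-run frequency above the unbiased 1/n.
pi = biased_cycle_walk(n=7, eps=0.1, target=0)
```

Computing the controller strategy that maximizes long-term expected benefit for an arbitrary graph and benefit function is the optimization problem the paper solves in polynomial time.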
Games Computers Play: GameTheoretic Aspects of Computing
 In
, 1992
Abstract

Cited by 16 (1 self)
this article is on protocols allowing the well-functioning parts of such a large and complex system to carry out their work despite the failure of others. Many deep and interesting results on such problems have been discovered by computer scientists in recent years, the incorporation of which into game theory can greatly enrich this field.