Results 11 - 20 of 37
On the (Non)Universality of the One-Time Pad
 In Proc. 43rd FOCS, 2002
Abstract

Cited by 23 (12 self)
Randomization is vital in cryptography: secret keys should be randomly generated and most cryptographic primitives (e.g., encryption) must be probabilistic. As a common abstraction, it is assumed that there is a source of truly random bits available to all the participants of the system. While convenient, this assumption is often highly unrealistic, and cryptographic systems have to be built based on imperfect sources of randomness. Remarkably, this fundamental problem has received little or no attention so far, despite the fact that a related question of simulating probabilistic (BPP) algorithms with imperfect random sources has a long and rich history.
Products and Help Bits in Decision Trees, 1994
Abstract

Cited by 22 (1 self)
We investigate two problems concerning the complexity of evaluating a function f at a k-tuple of unrelated inputs by k parallel decision tree algorithms. In the product problem, for some fixed depth bound d, we seek to maximize the fraction of input k-tuples for which all k decision trees are correct. Assume that for a single input to f, the best decision tree algorithm of depth d is correct on a fraction p of inputs. We prove that the maximum fraction of k-tuples on which k depth-d algorithms are all correct is at most p^k, which matches the trivial lower bound. We show that if we replace the depth-d restriction by "expected depth d", then this result fails. In the help-bit problem, we are permitted to ask k − 1 arbitrary binary questions about the k-tuple of inputs. For each possible (k − 1)-tuple of answers to these queries we have a k-tuple of decision trees which are supposed to correctly compute all functions on k-tuples that are consistent with the part...
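The p^k benchmark above can be checked by brute force on a toy instance. A minimal sketch under an assumption not in the abstract: take f to be the AND of two uniform bits with depth bound d = 1, where a best tree simply answers 0 (success p = 3/4), and run that tree independently on each coordinate of a k-tuple, achieving exactly the trivial bound p^k:

```python
from itertools import product

def tree(x):
    # A best depth-1 strategy against AND of two uniform bits: answering
    # 0 regardless is correct on 3 of the 4 inputs (p = 3/4).
    return 0

def f(x):
    return x[0] & x[1]

inputs = list(product((0, 1), repeat=2))
p = sum(tree(x) == f(x) for x in inputs) / len(inputs)

# Fraction of k-tuples on which the tree is correct in every coordinate.
k = 3
tuples = list(product(inputs, repeat=k))
all_correct = sum(all(tree(x) == f(x) for x in tup)
                  for tup in tuples) / len(tuples)
print(p, all_correct, p ** k)  # 0.75 0.421875 0.421875
```

The theorem quoted above says no choice of k depth-d trees can beat this independent-runs baseline.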
Improved Randomness Extraction from Two Independent Sources
 In Proc. of 8th RANDOM, 2004
Abstract

Cited by 22 (5 self)
Given two independent weak random sources X, Y with the same length ℓ and min-entropies bX, bY whose sum is greater than ℓ + Ω(polylog(ℓ/ε)), we construct a deterministic two-source extractor (aka "blender") that extracts max(bX, bY) + (bX + bY − ℓ − 4 log(1/ε)) bits which are ε-close to uniform. In contrast, the best previously published construction [4] extracted at most (bX + bY − ℓ − 2 log(1/ε))/2 bits. Our main technical tool is a construction of a strong two-source extractor that extracts (bX + bY − ℓ − 2 log(1/ε)) bits which are ε-close to being uniform and independent of one of the sources (aka "strong blender"), so that they can later be reused as a seed to a seeded extractor. Our strong two-source extractor construction improves the best previously published construction of such strong blenders [7] by a factor of 2, applies to more sources X and Y, and is considerably simpler than the latter. Our methodology also unifies several of the previous two-source extractor constructions from the literature.
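A toy illustration of two-source extraction (not the paper's construction): the classical one-bit inner-product blender, run on two flat sources of min-entropy ℓ − 1. Lindsey's lemma bounds the output bias by 2^((ℓ − bX − bY)/2 − 1), which is 1/16 for these parameters:

```python
import random

random.seed(0)
ell = 8  # source length in bits

def blend(x: int, y: int) -> int:
    """One-bit two-source extractor: the inner product <x, y> mod 2."""
    return bin(x & y).count("1") % 2

# Flat weak sources: uniform over random subsets of size 2^(ell-1),
# i.e. min-entropy bX = bY = ell - 1.
src_x = random.sample(range(2 ** ell), 2 ** (ell - 1))
src_y = random.sample(range(2 ** ell), 2 ** (ell - 1))

# Bias of the extracted bit over all pairs drawn from the two sources.
ones = sum(blend(x, y) for x in src_x for y in src_y)
bias = abs(ones / (len(src_x) * len(src_y)) - 0.5)
print(f"bias of extracted bit: {bias:.4f}")  # provably at most 0.0625 here
```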
Constructions of Near-Optimal Extractors Using Pseudo-Random Generators
 Electronic Colloquium on Computational Complexity, 1998
Abstract

Cited by 20 (3 self)
We introduce a new approach to construct extractors: combinatorial objects akin to expander graphs that have several applications. Our approach is based on error-correcting codes and on the Nisan-Wigderson pseudorandom generator. A straightforward application of our approach yields a construction that is simple to describe and analyze, does not use any of the standard techniques used in related results, and improves or subsumes almost all the previous constructions.

1 Introduction

Informally defined, an extractor is a function that extracts randomness from a weakly random distribution. Explicit constructions of extractors have several applications and are typically very hard to achieve. In this paper we introduce a new approach to the explicit construction of extractors. Our approach yields a construction that improves most of the known results, and that is optimal for certain parameters. Furthermore, our construction is simple and uses techniques that were never used in this field...
On extracting private randomness over a public channel
 In Proc. RANDOM ’03, 2003
Abstract

Cited by 20 (4 self)
We introduce the notion of a super-strong extractor. Given two independent weak random sources X, Y, such an extractor EXT(·, ·) has the property that EXT(X, Y) is statistically random even if one is given Y. Namely, 〈Y, EXT(X, Y)〉 ≈ 〈Y, R〉. Super-strong extractors generalize the notion of strong extractors [16], which assume that Y is truly random, and extractors from two weak random sources [26, 7], which only assure that EXT(X, Y) ≈ R. We show that super-strong extractors have many natural applications to the design of cryptographic systems in a setting where different parties have independent weak sources of randomness but have to communicate over an insecure channel. For example, they allow one party to “help” another party extract private randomness: the “helper” simply sends Y, and the “client” gets private randomness EXT(X, Y). In particular, this allows two parties to derive a nearly random key after initial agreement on only a weak shared key, without using ideal local randomness. We show that optimal super-strong extractors exist, which are capable of extracting all the randomness from X, as long as Y has a logarithmic amount of min-entropy. This generalizes a similar result for strong extractors, and improves upon previously known bounds [7] for the weaker problem of randomness extraction from two independent random sources. We also give explicit super-strong extractors which work provided the...
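The conditional requirement 〈Y, EXT(X, Y)〉 ≈ 〈Y, R〉 can be probed empirically on a one-bit toy extractor (the inner product, an illustrative stand-in rather than the paper's construction): for a single output bit, closeness given Y amounts to the bias of EXT(X, y) being small on average over y drawn from Y.

```python
import random

random.seed(2)
ell = 8

def ext(x: int, y: int) -> int:
    """One-bit inner-product extractor <x, y> mod 2 (illustrative only)."""
    return bin(x & y).count("1") % 2

# Flat weak sources of min-entropy ell - 1.
X = random.sample(range(2 ** ell), 2 ** (ell - 1))
Y = random.sample(range(2 ** ell), 2 ** (ell - 1))

# Average, over y ~ Y, of the statistical distance of EXT(X, y)
# from a uniform bit (i.e. half the conditional bias).
avg_bias = sum(abs(sum((-1) ** ext(x, y) for x in X)) / len(X)
               for y in Y) / (2 * len(Y))
print(f"average conditional bias: {avg_bias:.4f}")
```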
Approximating Probability Distributions Using Small Sample Spaces
 Combinatorica, 1995
Abstract

Cited by 15 (0 self)
We formulate the notion of a "good approximation" to a probability distribution over a finite abelian group. The approximate distribution is characterized by a parameter ε, the quality of the approximation, which is a bound on the difference between corresponding Fourier coefficients of the two distributions. It is also required that the sample space of the approximate distribution be of size polynomial in the representation length of the group elements as well as in 1/ε. Such approximations are useful in reducing or eliminating the use of randomness in randomized algorithms. We demonstrate the existence of such good approximations to arbitrary distributions. In the case of n random variables distributed uniformly and independently over the range {0, ..., d − 1}, we provide an efficient construction of a good approximation. The constructed approximation has the property that any linear combination of the random variables (modulo d) has essentially the same behavior under the...
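The quality measure can be made concrete for the group Z_2^n, where the Fourier coefficient indexed by α is the bias of the linear combination α·x (mod 2). A small sketch under that assumption, using the even-weight strings as a half-size sample space: every linear test except the full parity behaves exactly as under the uniform distribution.

```python
from itertools import product

def fourier_coeffs(sample_space, n):
    """Fourier coefficients of the uniform distribution over sample_space,
    viewed as a distribution on Z_2^n: the coefficient at alpha is
    E[(-1)^(alpha . x)], the bias of the linear combination alpha . x mod 2."""
    coeffs = {}
    for alpha in product((0, 1), repeat=n):
        total = sum((-1) ** sum(a * x for a, x in zip(alpha, pt))
                    for pt in sample_space)
        coeffs[alpha] = total / len(sample_space)
    return coeffs

n = 3
full = list(product((0, 1), repeat=n))
even = [pt for pt in full if sum(pt) % 2 == 0]  # half-size sample space

c = fourier_coeffs(even, n)
# Coefficient 0 (exactly as under uniform) for every nontrivial alpha
# except the all-ones parity test, where the bias is 1.
print({a: v for a, v in c.items() if any(a) and v != 0})
```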
Three XOR-Lemmas: An Exposition
 Electronic Colloquium on Computational Complexity (ECCC), 1995
Abstract

Cited by 14 (0 self)
Abstract. We provide an exposition of three lemmas that relate general properties of distributions over bit strings to the exclusive-or (xor) of values of certain bit locations. The first XOR-Lemma, commonly attributed to Umesh Vazirani (1986), relates the statistical distance of a distribution from the uniform distribution over bit strings to the maximum bias of the xor of certain bit positions. The second XOR-Lemma, due to Umesh and Vijay Vazirani (19th STOC, 1987), is a computational analogue of the first. It relates the pseudorandomness of a distribution to the difficulty of predicting the xor of bits in particular or random positions. The third Lemma, due to Goldreich and Levin (21st STOC, 1989), relates the difficulty of retrieving a string to the unpredictability of the xor of random bit positions. The most notable XOR Lemma, the so-called Yao XOR Lemma, is not discussed here. We focus on the proofs of the aforementioned three lemmas. Our exposition deviates from the original proofs, yielding proofs that are believed to be simpler, of wider applicability, and that establish somewhat stronger quantitative results. Credit for these improved proofs is due to several researchers.
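The first XOR-Lemma can be verified numerically on a tiny example. A minimal sketch, assuming the standard formulation: the statistical distance SD from uniform is at most half the square root of the sum of squared xor biases, and hence at most (1/2)·2^(n/2) times the maximum bias.

```python
import math
from itertools import product

def xor_bias(dist, alpha):
    """Bias of the xor of the bit positions selected by alpha:
    E[(-1)^(alpha . x)] under the distribution dist."""
    return sum(p * (-1) ** sum(a * b for a, b in zip(alpha, x))
               for x, p in dist.items())

n = 3
points = list(product((0, 1), repeat=n))
# A mildly non-uniform distribution: even-weight strings are twice as likely.
dist = {x: (2 if sum(x) % 2 == 0 else 1) / 12 for x in points}

sd = 0.5 * sum(abs(p - 2 ** -n) for p in dist.values())
biases = [abs(xor_bias(dist, a)) for a in points if any(a)]

# First XOR-Lemma: SD <= (1/2) * sqrt(sum of squared biases)
#                     <= (1/2) * 2^(n/2) * max bias.
print(f"SD = {sd:.4f}, max xor bias = {max(biases):.4f}, "
      f"bound = {0.5 * math.sqrt(sum(b * b for b in biases)):.4f}")
```

Here only the full-parity test is biased, and both sides of the bound meet with equality.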
Biased random walks
 In Proceedings of the Twenty-Fourth Annual ACM Symposium on the Theory of Computing, 1992
Abstract

Cited by 13 (0 self)
How much can an imperfect source of randomness affect an algorithm? We examine several simple questions of this type concerning the long-term behavior of a random walk on a finite graph. In our setup, at each step of the random walk a “controller” can, with a certain small probability, fix the next step, thus introducing a bias. We analyze the extent to which the bias can affect the limit behavior of the walk. The controller is assumed to associate a real, nonnegative “benefit” with each state, and to strive to maximize the long-term expected benefit. We derive tight bounds on the maximum of this objective function over all of the controller's strategies, and present polynomial-time algorithms for computing the optimal controller strategy.
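The setup is easy to simulate on a toy graph. A hypothetical sketch, not the paper's algorithms: a walk on the n-cycle where with probability delta the controller steers toward a single state of benefit 1 (all other states have benefit 0), so the long-run expected benefit is the fraction of time spent at that state.

```python
import random

random.seed(1)

def walk_benefit(n, steps, delta, goal=0):
    """Random walk on the n-cycle; with probability delta a controller
    fixes the next step to move toward `goal`. Returns the fraction of
    time spent at the goal state."""
    pos, hits = n // 2, 0
    for _ in range(steps):
        if random.random() < delta:
            # Controller: step in the direction that shortens the
            # cycle distance to the goal.
            step = 1 if (goal - pos) % n <= n // 2 else -1
        else:
            step = random.choice((1, -1))
        pos = (pos + step) % n
        hits += pos == goal
    return hits / steps

unbiased = walk_benefit(n=16, steps=200_000, delta=0.0)
biased = walk_benefit(n=16, steps=200_000, delta=0.1)
print(unbiased, biased)  # unbiased ~ 1/16; the bias raises time at the goal
```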
Explicit OR-dispersers with polylogarithmic degree
 J. ACM, 1998
Abstract

Cited by 13 (1 self)
An (N, M, T)-OR-disperser is a bipartite multigraph G = (V, W, E) with |V| = N and |W| = M, having the following expansion property: any subset of V having at least T vertices has a neighbor set of size at least M/2. For any pair of constants ξ, λ with 1 ≥ ξ > λ ≥ 0, any sufficiently large N, and for any T ≥ 2^((log N)^ξ) and M ≤ 2^((log N)^λ), we give an explicit elementary construction of an (N, M, T)-OR-disperser such that the outdegree of any vertex in V is at most polylogarithmic in N. Using this with known applications of OR-dispersers yields several results. First, our construction implies that the complexity class Strong-RP defined by Sipser equals RP. Second, for any fixed η > 0, we give the first polynomial-time simulation of RP algorithms using the output of any “η-minimally random” source. For any integral R > 0, such a source accepts a single request for an R-bit string and generates the string according to a distribution that assigns probability at most 2^(−Rη) to any string. It is minimally random in the sense that any weaker source is...
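The expansion property in this definition is easy to check by brute force on small instances. A minimal sketch on a hypothetical toy graph (not the paper's construction); it suffices to test subsets of size exactly T, since adding vertices can only enlarge the neighbor set.

```python
from itertools import combinations

def is_or_disperser(neighbors, M, T):
    """Check the (N, M, T) OR-disperser property by brute force:
    every T-subset of V must see at least M/2 distinct right vertices."""
    V = range(len(neighbors))
    return all(len(set().union(*(neighbors[v] for v in S))) >= M / 2
               for S in combinations(V, T))

# Toy instance with N = 6, M = 4, T = 2: vertex v is joined to two
# consecutive residues mod 4, so any pair of left vertices covers
# at least 2 = M/2 right vertices.
nbrs = [{v % 4, (v + 1) % 4} for v in range(6)]
print(is_or_disperser(nbrs, M=4, T=2))  # True
```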