Results 1 - 10 of 26
Extractors with weak random seeds
 In Proceedings of the 37th Annual ACM Symposium on Theory of Computing, 2005
Abstract

Cited by 83 (5 self)
We show how to extract random bits from two or more independent weak random sources in cases where only one source is of linear min-entropy and all other sources are of logarithmic min-entropy. Our main results are as follows:
1. A long line of research, starting with Nisan and Zuckerman [15], gives explicit constructions of seeded extractors, that is, extractors that use a short seed of truly random bits to extract randomness from a weak random source. For every such extractor E, with seed of length d, we construct an extractor E′, with seed of length d′ = O(d), that achieves the same parameters as E but only requires the seed to be of min-entropy larger than (1/2 + δ) · d′ (rather than fully random), where δ is an arbitrarily small constant.
2. Fundamental results of Chor and Goldreich, and of Vazirani [6, 22], show how to extract Ω(n) random bits from two (independent) sources of length n and min-entropy larger than (1/2 + δ) · n, where δ is an arbitrarily small constant. We show how to extract Ω(n) random bits (with optimal probability of error) when only one source is of min-entropy (1/2 + δ) · n and the other source is of logarithmic min-entropy.
3. A recent breakthrough of Barak, Impagliazzo and Wigderson [4] shows how to extract Ω(n) random bits from a constant number of (independent) sources of length n and min-entropy larger than δn, where δ is an arbitrarily small constant. We show how to extract Ω(n) random bits (with optimal probability of error) when only one source is of min-entropy δn and all other (constant number of) sources are of logarithmic min-entropy.
4. A very recent result of Barak, Kindler, Shaltiel, Sudakov and Wigderson [5] shows how to extract a constant number of random bits from three (independent) sources of length n and min-entropy larger than δn, where δ is an arbitrarily small constant. We show how to extract Ω(n) random bits, with sub-constant probability of error, from one source of min-entropy δn and two sources of logarithmic min-entropy.
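The two-source results in items 2-4 build on the classical Chor-Goldreich setting, where the extractor is simply the inner product mod 2 of the two sources. A minimal sketch of that classical idea (a point of reference only, not this paper's construction):

```python
def inner_product_extractor(x: str, y: str) -> int:
    """Classical two-source extractor (Chor-Goldreich): the inner
    product mod 2 of two independent n-bit sources is a nearly
    unbiased bit when both sources have min-entropy rate above 1/2.
    The paper above weakens the requirement on all but one source."""
    assert len(x) == len(y)
    return sum(int(a) & int(b) for a, b in zip(x, y)) % 2

# One output bit from two 4-bit samples.
bit = inner_product_extractor("1011", "0110")   # -> 1
```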
Deterministic Extractors for Affine Sources over Large Fields
 Electronic Colloquium on Computational Complexity, Report No. 108, 2005
Abstract

Cited by 48 (7 self)
An (n, k)-affine source over a finite field F is a random variable X = (X1, ..., Xn) ∈ F^n which is uniformly distributed over an (unknown) k-dimensional affine subspace of F^n. We show how to (deterministically) extract practically all the randomness from affine sources, for any field of size larger than n^c (where c is a large enough constant). Our main results are as follows:
1. (For arbitrary k): For any n, k and any F of size larger than n^20, we give an explicit construction of a function D: F^n → F^{k−1} such that for any (n, k)-affine source X over F, the distribution of D(X) is ɛ-close to uniform, where ɛ is polynomially small in |F|.
2. (For k = 1): For any n and any F of size larger than n^c, we give an explicit construction of a function D: F^n → {0, 1}^{(1−δ) log_2 |F|} such that for any (n, 1)-affine source X over F, the distribution of D(X) is ɛ-close to uniform, where ɛ is polynomially small in |F|. Here, δ > 0 is an arbitrarily small constant, and c is a constant depending on δ.
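For concreteness, an (n, k)-affine source as defined above can be sampled like this (a toy sketch: a real source fixes the hidden subspace once and varies only the coefficients, and the direction vectors here are drawn at random rather than checked for linear independence):

```python
import random

def sample_affine_source(p: int, n: int, k: int) -> list:
    """Draw one point of an (n, k)-affine source over F_p: pick an
    offset and k direction vectors (the hidden affine subspace), then
    return offset + a random combination of the directions, mod p."""
    base = [random.randrange(p) for _ in range(n)]
    dirs = [[random.randrange(p) for _ in range(n)] for _ in range(k)]
    coeffs = [random.randrange(p) for _ in range(k)]
    return [(base[i] + sum(c * d[i] for c, d in zip(coeffs, dirs))) % p
            for i in range(n)]

x = sample_affine_source(101, 5, 2)   # one sample from F_101^5
```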
Simulating Independence: New Constructions of Condensers, Ramsey Graphs, Dispersers, and Extractors
 In Proceedings of the 37th Annual ACM Symposium on Theory of Computing, 2005
Abstract

Cited by 47 (10 self)
We present new explicit constructions of deterministic randomness extractors, dispersers and related objects. More precisely, a distribution X over binary strings of length n is called a δ-source if it assigns probability at most 2^{−δn} to any string of length n, and for any δ > 0 we construct the following poly(n)-time computable functions:
2-source disperser: D: ({0, 1}^n)^2 → {0, 1} such that for any two independent δ-sources X1, X2, the support of D(X1, X2) is {0, 1}.
Bipartite Ramsey graph: Let N = 2^n. A corollary is that the function D is a 2-coloring of the edges of K_{N,N} (the complete bipartite graph over two sets of N vertices) such that no induced N^δ by N^δ subgraph is monochromatic.
3-source extractor: E: ({0, 1}^n)^3 → {0, 1} such that for any three independent δ-sources X1, X2, X3, the output E(X1, X2, X3) is (o(1)-close to being) an unbiased random bit.
No previous explicit construction was known for either of these for any δ < 1/2, and these results constitute major progress on long-standing open problems. A component in these results is a new construction of condensers that may be of independent interest.
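The δ-source condition above is just a bound on the maximum probability of any outcome. A small helper to check it on an explicit toy distribution (an illustration of the definition, not of the paper's constructions):

```python
def is_delta_source(probs: dict, n: int, delta: float) -> bool:
    """A distribution over n-bit strings is a δ-source iff it assigns
    probability at most 2^(-δn) to every string, i.e. its min-entropy
    is at least δn."""
    return max(probs.values()) <= 2 ** (-delta * n)

# Uniform over 4 of the 8 three-bit strings: min-entropy 2 = (2/3)*3.
probs = {s: 0.25 for s in ("000", "011", "101", "110")}
ok = is_delta_source(probs, n=3, delta=2/3)   # -> True
```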
On the (im)possibility of cryptography with imperfect randomness
 In Proc. 45th IEEE FOCS, 2004
Abstract

Cited by 23 (6 self)
We investigate the feasibility of a variety of cryptographic tasks with imperfect randomness. The kind of imperfect randomness we consider is entropy sources, such as those considered by Santha and Vazirani, Chor and Goldreich, and Zuckerman. We show the following: Certain cryptographic tasks, such as bit commitment, encryption, secret sharing, zero-knowledge, non-interactive zero-knowledge, and secure two-party computation for any non-trivial function, are impossible to realize if parties have access only to entropy sources with slightly less-than-perfect entropy, i.e., sources with imperfect randomness. These results are unconditional and do not rely on any unproven assumption. On the other hand, based on stronger variants of standard assumptions, secure signature schemes are possible with imperfect entropy sources. As another positive result, we show (without any unproven assumption) that interactive proofs can be made sound with respect to imperfect entropy sources.
Simpler session-key generation from short random passwords
 In 1st TCC, 2004
Abstract

Cited by 19 (0 self)
Goldreich and Lindell (CRYPTO '01) recently presented the first protocol for password-authenticated key exchange in the standard model (with no common reference string or setup assumptions other than the shared password). However, their protocol uses several heavy tools and has a complicated analysis. We present a simplification of the Goldreich-Lindell (GL) protocol and analysis for the special case when the dictionary is of the form D = {0, 1}^d, i.e., the password is a short string chosen uniformly at random (in the spirit of an ATM PIN). The security bound achieved by our protocol is somewhat worse than that of the GL protocol. Roughly speaking, our protocol guarantees that the adversary can “break” the scheme with probability at most O(poly(n)/|D|)^{Ω(1)}, whereas the GL protocol guarantees a bound of O(1/|D|). We also ...
Harvesting verifiable challenges from oblivious online sources
 In Proc. 14th ACM Conference on Computer and Communications Security (CCS '07), 2007
Abstract

Cited by 8 (2 self)
Several important security protocols require parties to perform computations based on random challenges. Traditionally, proving that the challenges were randomly chosen has required interactive communication among the parties or the existence of a trusted server. We offer an alternative solution where challenges are harvested from oblivious servers on the Internet. This paper describes a framework for deriving “harvested challenges” by mixing data from various pre-existing online sources. While individual sources may become predictable or fall under adversarial control, we provide a policy language that allows application developers to specify combinations of sources that meet their security needs. Participants can then convince each other that their challenges were formed freshly and in accordance with the policy. We present Combine, an open-source implementation of our framework, and show how it can be applied to a variety of applications, including remote storage auditing and non-interactive client puzzles.
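The paper's framework is not reproduced here, but the core idea of mixing several sources so that unpredictability survives as long as one source does can be sketched with a plain hash (the function name and inputs below are hypothetical stand-ins, not Combine's actual API):

```python
import hashlib

def harvest_challenge(snapshots: list) -> bytes:
    """Derive a challenge by hashing length-prefixed data snapshots
    taken from several online sources. If at least one snapshot was
    unpredictable to the adversary, so is the resulting digest."""
    h = hashlib.sha256()
    for s in snapshots:
        h.update(len(s).to_bytes(8, "big"))  # prefix avoids ambiguity
        h.update(s)
    return h.digest()

# Hypothetical stand-ins for data fetched from oblivious servers.
challenge = harvest_challenge([b"stock-feed-bytes", b"news-page-bytes"])
```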
Increasing the Output Length of Zero-Error Dispersers
 2008
Abstract

Cited by 8 (6 self)
Let C be a class of probability distributions over a finite set Ω. A function D: Ω → {0, 1}^m is a disperser for C with entropy threshold k and error ɛ if for any distribution X in C that gives positive probability to at least 2^k elements, the distribution D(X) gives positive probability to at least (1 − ɛ)2^m elements. A long line of research is devoted to giving explicit (that is, polynomial-time computable) dispersers (and related objects called “extractors”) for various classes of distributions while trying to maximize m as a function of k. In this paper we are interested in explicitly constructing zero-error dispersers (that is, dispersers with error ɛ = 0). For several interesting classes of distributions there are explicit constructions in the literature of zero-error dispersers with “small” output length m, and we give improved constructions that achieve “large” output length, namely m = Ω(k). We achieve this by developing a general technique to improve the output length of zero-error dispersers (namely, to transform a disperser with short output length into one with large output length). This strategy works for several classes of sources and is inspired by a transformation that improves the output length of extractors (which was given in [31], building on earlier work ...)
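The ɛ = 0 case of the definition above is easy to check by brute force on toy examples (an illustration of the definition, not of the paper's constructions):

```python
def is_zero_error_disperser(D, supports, m: int) -> bool:
    """Zero-error disperser check: for every allowed source support,
    the image under D must cover all 2^m outputs (here outputs are
    integers in [0, 2^m))."""
    full = set(range(2 ** m))
    return all({D(x) for x in S} == full for S in supports)

# Toy example: Omega = {0, 1, 2, 3}, m = 1, D = parity.
D = lambda x: x % 2
supports = [{0, 1}, {1, 2}, {2, 3}]   # each support hits both parities
ok = is_zero_error_disperser(D, supports, m=1)   # -> True
```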
Simple Affine Extractors using Dimension Expansion
 2009
Abstract

Cited by 8 (1 self)
Let F_q be the field of q elements. An (n, k)-affine extractor is a mapping D: F_q^n → {0, 1} such that for any k-dimensional affine subspace X ⊆ F_q^n, D(x) is an almost unbiased bit when x is chosen uniformly from X. Loosely speaking, the problem of explicitly constructing affine extractors gets harder as q gets smaller and easier as k gets larger. This is reflected in previous results: when q is ‘large enough’, specifically q = Ω(n^2), Gabizon and Raz [3] construct affine extractors for any k ≥ 1. In the ‘hardest case’, i.e. when q = 2, Bourgain [2] constructs affine extractors for k ≥ δn for any constant (and even slightly sub-constant) δ > 0. Our main result is the following: fix any k ≥ 2 and let d = 5n/k. Then whenever q > 2 · d^2 and p = char(F_q) > d, we give an explicit (n, k)-affine extractor. For example, when k = δn for constant δ > 0, we get an extractor for a field of constant size Ω((1/δ)^2). Thus our result may be viewed as a ‘field-size/dimension’ trade-off for affine extractors. Although for large k we are not able to improve (or even match) the previous result of [2], our construction and proof have the advantage of being very simple: assume n is prime and d is odd, and fix any nontrivial linear map T: F_q^n → F_q. Define QR: F_q → {0, 1} by QR(x) = 1 if and only if x is a quadratic residue. Then the function D: F_q^n → {0, 1} defined by D(x) := QR(T(x^d)) is an (n, k)-affine extractor. Our proof uses a result of Hou, Leung and Xiang [4] giving a lower bound on the dimension of products of subspaces.
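The construction quoted at the end is explicit enough to sketch. The toy code below specializes to n = 1, so that F_q^n is just a prime field F_p and the linear map T is multiplication by a nonzero constant, with Euler's criterion as the quadratic-residue test; this is a specialization for illustration, not the paper's full extension-field construction.

```python
def qr(x: int, p: int) -> int:
    """QR(x) = 1 iff x is a nonzero quadratic residue mod the odd
    prime p, via Euler's criterion: x^((p-1)/2) = 1 (mod p)."""
    return 1 if x % p != 0 and pow(x, (p - 1) // 2, p) == 1 else 0

def affine_extractor_bit(x: int, p: int, d: int, t: int = 1) -> int:
    """D(x) = QR(T(x^d)) with the linear map T(y) = t*y mod p,
    following the abstract's recipe in the n = 1 case."""
    return qr((t * pow(x, d, p)) % p, p)

# Mod 7 the quadratic residues are {1, 2, 4}.
affine_extractor_bit(2, 7, 3)   # 2^3 = 1 (mod 7), a residue -> 1
affine_extractor_bit(3, 7, 3)   # 3^3 = 6 (mod 7), a non-residue -> 0
```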
Randomness condensers for efficiently samplable, seed-dependent sources. Full version of this paper available from the authors’ websites.
Abstract

Cited by 7 (5 self)
We initiate a study of randomness condensers for sources that are efficiently samplable but may depend on the seed of the condenser. That is, we seek functions Cond: {0, 1}^n × {0, 1}^d → {0, 1}^m such that if we choose a random seed S ← {0, 1}^d, and a source X = A(S) is generated by a randomized circuit A of size t such that X has min-entropy at least k given S, then Cond(X; S) should have min-entropy at least some k′ given S. The distinction from the standard notion of randomness condensers is that the source X may be correlated with the seed S (but is restricted to be efficiently samplable). Randomness extractors of this type (corresponding to the special case where k′ = m) have been implicitly studied in the past (by Trevisan and Vadhan, FOCS ’00). We show that: • Unlike extractors, we can have randomness condensers for samplable, seed-dependent sources whose computational complexity is smaller than the size t of the adversarial sampling algorithm A. Indeed, we show that sufficiently strong collision-resistant hash functions are seed-dependent condensers that produce outputs with min-entropy k′ = m − O(log t), i.e. logarithmic entropy deficiency.
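The positive result in the bullet can be illustrated with an off-the-shelf hash standing in for the collision-resistant family (a sketch of the idea only; the paper's parameters and security reduction are not captured here):

```python
import hashlib

def condense(seed: bytes, x: bytes) -> bytes:
    """Seed-dependent condenser sketch: hash the source together with
    the seed. Per the abstract, a sufficiently strong collision-
    resistant hash yields output min-entropy m - O(log t) even when
    x = A(seed) was produced by a size-t sampler that saw the seed."""
    return hashlib.sha256(seed + x).digest()

out = condense(b"public-seed", b"adversarially-sampled-source")
```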
Impossibility of independence amplification in Kolmogorov Complexity Theory
Abstract

Cited by 7 (4 self)
The paper studies randomness extraction from sources with bounded independence and the issue of independence amplification of sources, using the framework of Kolmogorov complexity. The dependency of strings x and y is dep(x, y) = max{C(x) − C(x | y), C(y) − C(y | x)}, where C(·) denotes the Kolmogorov complexity. It is shown that there exists a computable Kolmogorov extractor f such that, for any two n-bit strings with complexity s(n) and dependency α(n), it outputs a string of length s(n) with complexity s(n) − α(n) conditioned on either one of the input strings. It is proven that these are the optimal parameters a Kolmogorov extractor can achieve. It is also shown that independence amplification cannot be effectively realized. Specifically, if (after excluding a trivial case) there exist computable functions f1 and f2 such that dep(f1(x, y), f2(x, y)) ≤ β(n) for all n-bit strings x and y with dep(x, y) ≤ α(n), then β(n) ≥ α(n) − O(log n).