Results 11–20 of 34
Deterministic Extractors for Bit-Fixing Sources by Obtaining an Independent Seed
 Electronic Colloquium on Computational Complexity
, 2005
Cited by 23 (6 self)
An (n, k)-bit-fixing source is a distribution X over {0,1}^n such that there is a subset of k variables among X_1, ..., X_n which are uniformly distributed and independent of each other, while the remaining n − k variables are fixed. A deterministic bit-fixing source extractor is a function E : {0,1}^n → {0,1}^m which, on an arbitrary (n, k)-bit-fixing source, outputs m bits that are statistically close to uniform. Recently, Kamp and Zuckerman [44th FOCS, 2003] gave a construction of a deterministic bit-fixing source extractor that extracts Ω(k^2/n) bits and requires k > √n. In this paper we give constructions of deterministic bit-fixing source extractors that extract (1 − o(1))k bits whenever k > (log n)^c for some universal constant c > 0. Thus, our constructions extract almost all the randomness from bit-fixing sources and work even when k is small. For k ≫ √n the extracted bits have statistical distance 2^{−n^{Ω(1)}} from uniform, and for k ≤ √n the extracted bits have statistical distance k^{−Ω(1)} from uniform. Our technique gives a general method to transform deterministic bit-fixing source extractors that extract few bits into extractors which extract almost all the bits.
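As a concrete toy illustration (the parity function, a standard textbook example, not the construction from this paper), XOR-ing all the bits of an oblivious bit-fixing source yields one perfectly uniform output bit whenever k ≥ 1. The sketch below verifies this by exact enumeration; the example source parameters are made up:

```python
from itertools import product

def parity_extractor(x):
    """Output the XOR of all input bits: a 1-bit deterministic extractor."""
    return sum(x) % 2

def parity_distribution(n, fixed):
    """Exactly enumerate an (n, k)-bit-fixing source.

    `fixed` maps a subset of positions to constant bits; the remaining
    k positions range uniformly over {0,1}^k.  Returns the counts of
    extractor outputs 0 and 1 over all 2^k source samples.
    """
    free = [i for i in range(n) if i not in fixed]
    counts = [0, 0]
    for bits in product([0, 1], repeat=len(free)):
        x = [0] * n
        for i, b in fixed.items():
            x[i] = b
        for i, b in zip(free, bits):
            x[i] = b
        counts[parity_extractor(x)] += 1
    return counts

# A (6, 3)-bit-fixing source: positions 0, 2, 5 fixed, the rest uniform.
print(parity_distribution(6, {0: 1, 2: 0, 5: 1}))  # -> [4, 4]: exactly uniform
```

Parity only gives one bit, of course; the point of the constructions above is to extract (1 − o(1))k bits.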
Improved Randomness Extraction from Two Independent Sources
 In Proc. of 8th RANDOM
, 2004
Cited by 22 (5 self)
Given two independent weak random sources X, Y with the same length ℓ and min-entropies b_X, b_Y whose sum is greater than ℓ + Ω(polylog(ℓ/ε)), we construct a deterministic two-source extractor (aka "blender") that extracts max(b_X, b_Y) + (b_X + b_Y − ℓ − 4 log(1/ε)) bits which are ε-close to uniform. In contrast, the best previously published construction [4] extracted at most 1/2 (b_X + b_Y − ℓ − 2 log(1/ε)) bits. Our main technical tool is a construction of a strong two-source extractor that extracts (b_X + b_Y − ℓ − 2 log(1/ε)) bits which are ε-close to being uniform and independent of one of the sources (aka "strong blender"), so that they can later be reused as a seed to a seeded extractor. Our strong two-source extractor construction improves the best previously published construction of such strong blenders [7] by a factor of 2, applies to more sources X and Y, and is considerably simpler than the latter. Our methodology also unifies several of the previous two-source extractor constructions from the literature.
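For background, the classic single-bit two-source extractor is the inner product mod 2, due to Chor–Goldreich and Vazirani; it is the baseline such results improve on, not the construction above. The sketch below verifies the elementary fact its analysis rests on, namely that IP(x, ·) is exactly balanced for every nonzero x:

```python
from itertools import product

def ip_extractor(x, y):
    """Inner product mod 2 of two equal-length bit vectors."""
    return sum(a * b for a, b in zip(x, y)) % 2

def balance(x):
    """Count how often IP(x, y) equals 0 resp. 1 over all y in {0,1}^len(x)."""
    counts = [0, 0]
    for y in product([0, 1], repeat=len(x)):
        counts[ip_extractor(x, y)] += 1
    return counts

# For every nonzero x, IP(x, .) is a nonconstant linear form, hence balanced.
for x in product([0, 1], repeat=4):
    if any(x):
        assert balance(x) == [8, 8]
print("IP(x, .) is exactly balanced for every nonzero x")
```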
Constructions of Near-Optimal Extractors Using Pseudo-Random Generators
 Electronic Colloquium on Computational Complexity
, 1998
Cited by 20 (3 self)
We introduce a new approach to constructing extractors: combinatorial objects akin to expander graphs that have several applications. Our approach is based on error-correcting codes and on the Nisan–Wigderson pseudorandom generator. A straightforward application of our approach yields a construction that is simple to describe and analyze, does not use any of the standard techniques used in related results, and improves or subsumes almost all the previous constructions.

1 Introduction. Informally defined, an extractor is a function that extracts randomness from a weakly random distribution. Explicit constructions of extractors have several applications and are typically very hard to achieve. In this paper we introduce a new approach to the explicit construction of extractors. Our approach yields a construction that improves most of the known results, and that is optimal for certain parameters. Furthermore, our construction is simple and uses techniques that were never used in this field...
How to get more mileage from randomness extractors
, 2007
Cited by 18 (5 self)
Let C be a class of distributions over {0,1}^n. A deterministic randomness extractor for C is a function E : {0,1}^n → {0,1}^m such that for any X in C the distribution E(X) is statistically close to the uniform distribution. A long line of research deals with explicit constructions of such extractors for various classes C while trying to maximize m. In this paper we give a general transformation that transforms a deterministic extractor E that extracts "few" bits into an extractor E′ that extracts "almost all the bits present in the source distribution". More precisely, we prove a general theorem saying that if E and C satisfy certain properties, then we can transform E into an extractor E′. Our methods build on (and generalize) a technique of Gabizon, Raz and Shaltiel (FOCS 2004) that presents such a transformation for the very restricted class C of "oblivious bit-fixing sources". The high-level idea is to find properties of E and C which allow "recycling" the output of E so that it can be "reused" to operate on the source distribution. An obvious obstacle is that the output of E is correlated with the source distribution. Using our transformation we give an explicit construction of a two-source extractor E : {0,1}^n × {0,1}^n → {0,1}^m such that for every two independent distributions X_1 and X_2 over {0,1}^n with min-entropy at least k = (1/2 + δ)n and ε ≥ 2^{−log^4 n}, E(X_1, X_2) is ε-close to the uniform distribution on m = 2k − C_δ log(1/ε) bits. This result is optimal except for the precise constant C_δ and improves previous results by Chor and Goldreich (SICOMP 1988), Vazirani (Combinatorica 1987) and Dodis et al. (RANDOM 2004).
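The "ε-close" notion used throughout these abstracts is statistical (total variation) distance. A minimal helper, with hypothetical example distributions:

```python
def statistical_distance(p, q):
    """Total variation distance between two distributions given as dicts
    mapping outcomes to probabilities: (1/2) * sum |p(x) - q(x)|."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

# Made-up example: the uniform distribution on 2 bits vs. a skewed one.
uniform2 = {"00": 0.25, "01": 0.25, "10": 0.25, "11": 0.25}
skewed = {"00": 0.5, "01": 0.25, "10": 0.25}  # never outputs "11"
print(statistical_distance(uniform2, skewed))  # -> 0.25
```

A distribution is ε-close to uniform when this quantity is at most ε.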
Three XOR-lemmas: an exposition
 Electronic Colloquium on Computational Complexity (ECCC)
, 1995
Cited by 15 (1 self)
We provide an exposition of three lemmas that relate general properties of distributions over bit strings to the exclusive-or (xor) of values of certain bit locations. The first XOR-Lemma, commonly attributed to Umesh Vazirani (1986), relates the statistical distance of a distribution from the uniform distribution over bit strings to the maximum bias of the xor of certain bit positions. The second XOR-Lemma, due to Umesh and Vijay Vazirani (19th STOC, 1987), is a computational analogue of the first. It relates the pseudorandomness of a distribution to the difficulty of predicting the xor of bits in particular or random positions. The third lemma, due to Goldreich and Levin (21st STOC, 1989), relates the difficulty of retrieving a string to the unpredictability of the xor of random bit positions. The most notable XOR Lemma, the so-called Yao XOR Lemma, is not discussed here. We focus on the proofs of the aforementioned three lemmas. Our exposition deviates from the original proofs, yielding proofs that are believed to be simpler, of wider applicability, and establishing somewhat stronger quantitative results. Credits for these improved proofs are due to several researchers.
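The first XOR-Lemma, in the form we recall it, bounds the statistical distance from uniform by (1/2)·sqrt(Σ_{S≠∅} bias_S(X)²), which in turn is at most (2^{n/2}/2) times the maximum bias. It can be sanity-checked numerically; the distribution below is a made-up example:

```python
from itertools import product

def biases_and_distance(p, n):
    """Given a distribution p over {0,1}^n (dict: tuple -> prob), return
    (max xor-bias over nonempty S, statistical distance from uniform,
     the XOR-Lemma bound (1/2) * sqrt(sum of squared biases))."""
    points = list(product([0, 1], repeat=n))
    sq_sum = 0.0
    max_bias = 0.0
    for s in points:
        if not any(s):
            continue  # skip the empty index set S
        bias = abs(sum(p.get(x, 0.0) * (-1) ** sum(a * b for a, b in zip(s, x))
                       for x in points))
        sq_sum += bias ** 2
        max_bias = max(max_bias, bias)
    sd = 0.5 * sum(abs(p.get(x, 0.0) - 2 ** -n) for x in points)
    return max_bias, sd, 0.5 * sq_sum ** 0.5

# A slightly skewed distribution on 3 bits.
p = {x: 1 / 8 for x in product([0, 1], repeat=3)}
p[(0, 0, 0)] += 0.1
p[(1, 1, 1)] -= 0.1
max_bias, sd, bound = biases_and_distance(p, 3)
assert sd <= bound + 1e-12  # the XOR-Lemma bound holds
print(max_bias, sd, bound)
```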
On the (im)possibility of cryptography with imperfect randomness
 In Proc. 45th IEEE FOCS
, 2004
Cited by 15 (3 self)
We investigate the feasibility of a variety of cryptographic tasks with imperfect randomness. The kind of imperfect randomness we consider is entropy sources, such as those considered by Santha and Vazirani, Chor and Goldreich, and Zuckerman. We show the following: Certain cryptographic tasks like bit commitment, encryption, secret sharing, zero-knowledge, non-interactive zero-knowledge, and secure two-party computation for any non-trivial function are impossible to realize if parties have access to entropy sources with slightly less-than-perfect entropy, i.e., sources with imperfect randomness. These results are unconditional and do not rely on any unproven assumption. On the other hand, based on stronger variants of standard assumptions, secure signature schemes are possible with imperfect entropy sources. As another positive result, we show (without any unproven assumption) that interactive proofs can be made sound with respect to imperfect entropy sources.
Explicit OR-dispersers with polylogarithmic degree
 J. ACM
, 1998
Cited by 13 (1 self)
An (N, M, T)-OR-disperser is a bipartite multigraph G = (V, W, E) with |V| = N and |W| = M, having the following expansion property: any subset of V having at least T vertices has a neighbor set of size at least M/2. For any pair of constants ξ, λ with 1 ≥ ξ > λ ≥ 0, any sufficiently large N, and for any T ≥ 2^{(log N)^ξ} and M ≤ 2^{(log N)^λ}, we give an explicit elementary construction of an (N, M, T)-OR-disperser such that the out-degree of any vertex in V is at most polylogarithmic in N. Using this with known applications of OR-dispersers yields several results. First, our construction implies that the complexity class Strong-RP defined by Sipser equals RP. Second, for any fixed η > 0, we give the first polynomial-time simulation of RP algorithms using the output of any "η-minimally random" source. For any integral R > 0, such a source accepts a single request for an R-bit string and generates the string according to a distribution that assigns probability at most 2^{−Rη} to any string. It is minimally random in the sense that any weaker source is...
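The expansion property defining an OR-disperser can be checked by brute force on toy-sized graphs; the graph below is hypothetical and far smaller than the regime the construction targets:

```python
from itertools import combinations

def is_or_disperser(n, m, t, edges):
    """Brute-force check of the (N, M, T)-OR-disperser property on a small
    bipartite graph: every subset of V of size >= T must have a neighbour
    set covering at least M/2 of W.  `edges[v]` lists the neighbours of
    left vertex v (right vertices are 0..m-1)."""
    for size in range(t, n + 1):
        for subset in combinations(range(n), size):
            neighbours = set()
            for v in subset:
                neighbours.update(edges[v])
            if len(neighbours) < m / 2:
                return False
    return True

# A made-up toy graph with N = 4 left and M = 4 right vertices, degree 2.
edges = {0: [0, 1], 1: [1, 2], 2: [2, 3], 3: [3, 0]}
print(is_or_disperser(4, 4, 2, edges))  # -> True
```

The whole point of the paper is to achieve this property explicitly with polylogarithmic left degree at astronomically large N, where such exhaustive checking is out of the question.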
An introduction to randomness extractors
Cited by 7 (2 self)
We give an introduction to the area of "randomness extraction" and survey the main concepts of this area: deterministic extractors, seeded extractors and multiple-source extractors. For each one we briefly discuss background, definitions, explicit constructions and applications.
Extracting All the Randomness from a Weakly Random Source
, 1998
Cited by 6 (0 self)
In this paper, we give two explicit constructions of extractors, both of which work for a source of any min-entropy on strings of length n. The first extracts any constant fraction of the min-entropy using O(log^2 n) additional random bits. The second extracts all the min-entropy using O(log^3 n) additional random bits. Both constructions use fewer truly random bits than any previous construction which works for all min-entropies and extracts a constant fraction of the min-entropy. The extractors are obtained by observing that a weaker notion of "combinatorial design" suffices for the Nisan–Wigderson pseudorandom generator [NW94], which underlies the recent extractor of Trevisan [Tre98]. We give near-optimal constructions of such "weak designs" which achieve much better parameters than possible with the notion of designs used by Nisan–Wigderson and Trevisan.

1 Introduction. Roughly speaking, an extractor is a function which extracts truly random bits from a weakly random source, ...
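The weak-design condition, as we recall it from Raz, Reingold and Vadhan (for every i, Σ_{j<i} 2^{|S_i ∩ S_j|} ≤ ρ·(m − 1)), is easy to check mechanically. Pairwise-disjoint sets are the trivial, seed-hungry example satisfying it with ρ = 1:

```python
def is_weak_design(sets, rho):
    """Check the weak-design condition (in the Raz-Reingold-Vadhan sense,
    as we recall it): for every i,
        sum_{j < i} 2^{|S_i ∩ S_j|}  <=  rho * (len(sets) - 1).
    `sets` is a list of frozensets of seed positions."""
    m = len(sets)
    for i in range(m):
        total = sum(2 ** len(sets[i] & sets[j]) for j in range(i))
        if total > rho * (m - 1):
            return False
    return True

# Pairwise-disjoint sets: every intersection is empty, so each term is 2^0 = 1.
disjoint = [frozenset(range(4 * i, 4 * i + 4)) for i in range(5)]
print(is_weak_design(disjoint, rho=1))  # -> True
```

Disjoint sets need a seed of length m·l, which is far too long; the constructions in the paper pack the sets into a short seed while keeping ρ small.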
Efficient Reduction among Oblivious Transfer Protocols based on New Self-Intersecting Codes
Cited by 6 (5 self)
A (2 choose 1)-OT_2 (one-out-of-two bit oblivious transfer) is a technique by which a party S owning two secret bits b_0, b_1 can transfer one of them, b_c, to another party R, who chooses c. This is done in a way that does not release any bias about b_{1−c} to R nor any bias about c to S. One interesting extension of this transfer is the (2 choose 1)-OT_2^k (one-out-of-two string OT), in which the two secrets q_0, q_1 are elements of GF(2)^k instead of bits. A reduction of (2 choose 1)-OT_2^k to (2 choose 1)-OT_2 presented in [BCR86] uses O(k^{log_2 3}) calls to (2 choose 1)-OT_2 and thus raises an interesting combinatorial question: how many calls to (2 choose 1)-OT_2 are necessary and sufficient to achieve a (2 choose 1)-OT_2^k? In the current paper we answer this question quite precisely. We accomplish this reduction using Θ(k) calls to (2 choose 1)-OT_2. First, we show by probabilist...
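The functionality being reduced can be stated as code. The sketch below gives the ideal input/output behaviour only; it is deliberately not a protocol, and the naive bitwise string-OT reduction shown is correct but insecure, which is exactly what motivates the combinatorial question above:

```python
def ot_ideal(b0, b1, c):
    """Ideal one-out-of-two bit OT functionality: the receiver learns b_c
    and nothing about the other bit; the sender learns nothing about c.
    (Correctness only: a real protocol must enforce both privacy
    guarantees cryptographically.)"""
    assert b0 in (0, 1) and b1 in (0, 1) and c in (0, 1)
    return (b0, b1)[c]

def string_ot_naive(q0, q1, c):
    """Naive bitwise reduction of string OT to k independent bit OTs with
    the same choice bit c.  Correct, but NOT a secure reduction: nothing
    stops a cheating receiver from using different choice bits in
    different calls and learning bits of both strings -- the problem the
    code-based reduction in the paper addresses."""
    return [ot_ideal(a, b, c) for a, b in zip(q0, q1)]

print(string_ot_naive([1, 0, 1, 1], [0, 0, 1, 0], 0))  # -> [1, 0, 1, 1]
```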