Results 11 – 20 of 225
Dispersers, Deterministic Amplification, and Weak Random Sources.
, 1989
Abstract

Cited by 94 (12 self)
We use a certain type of expanding bipartite graphs, called disperser graphs, to design procedures for picking highly correlated samples from a finite set, with the property that the probability of hitting any sufficiently large subset is high. These procedures require a relatively small number of random bits and are robust with respect to the quality of the random bits. Using these sampling procedures to sample random inputs of polynomial-time probabilistic algorithms, we can simulate the performance of some probabilistic algorithms with fewer random bits or with low-quality random bits. We obtain the following results: 1. The error probability of an RP or BPP algorithm that operates with a constant error bound and requires n random bits can be made exponentially small (i.e., 2^-n) with only (3 + ε)n random bits, as opposed to standard amplification techniques that require Ω(n^2) random bits for the same task. This result is nearly optimal, since the informati...
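As a toy illustration of the baseline this abstract improves on (not the disperser-based sampling itself), the following sketch simulates standard amplification of a one-sided-error (RP-style) algorithm by independent repetition; the test procedure and its detection probability are hypothetical.

```python
import random

def rp_trial(x_is_yes, p_detect=0.5):
    # Hypothetical one-sided-error test: on a YES instance it answers
    # "yes" with probability p_detect; on a NO instance it never errs.
    return x_is_yes and random.random() < p_detect

def amplify(x_is_yes, k):
    # Standard (randomness-hungry) amplification: k independent trials,
    # accept if any trial accepts.  Error drops to (1 - p_detect)^k, at
    # the cost of k independent random samples -- the overhead the
    # abstract's disperser-based correlated sampling reduces.
    return any(rp_trial(x_is_yes) for _ in range(k))

random.seed(0)
runs = 10_000
errors = sum(not amplify(True, 30) for _ in range(runs))
print(errors)  # expected 0: per-run failure probability is ~2^-30
```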
Metropolized Independent Sampling with Comparisons to Rejection Sampling and Importance Sampling
, 1996
Abstract

Cited by 93 (3 self)
In this paper, a special Metropolis-Hastings-type algorithm, Metropolized independent sampling, first proposed in Hastings (1970), is studied in full detail. The eigenvalues and eigenvectors of the corresponding Markov chain, as well as a sharp bound for the total variation distance between the nth updated distribution and the target distribution, are provided. Furthermore, the relationships between this scheme, rejection sampling, and importance sampling are studied, with emphasis on their relative efficiencies. It is shown that Metropolized independent sampling is superior to rejection sampling in two aspects: asymptotic efficiency and ease of computation. Key Words: Coupling, Delta method, Eigen-analysis, Importance ratio.
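The scheme itself is short: proposals are drawn from a fixed distribution q independent of the current state, and a move from x to y is accepted with probability min(1, w(y)/w(x)), where w = π/q is the importance ratio mentioned in the key words. A minimal sketch on a toy discrete target (the target and proposal here are illustrative, not from the paper):

```python
import random

def metropolized_independent_sampler(target, proposal_sample, proposal_pmf,
                                     n_steps, x0):
    # Independence sampler (Hastings 1970): the proposal q ignores the
    # current state; acceptance uses the importance ratio w(z) = pi(z)/q(z).
    x = x0
    chain = []
    for _ in range(n_steps):
        y = proposal_sample()
        w_x = target(x) / proposal_pmf(x)
        w_y = target(y) / proposal_pmf(y)
        if random.random() < min(1.0, w_y / w_x):
            x = y
        chain.append(x)
    return chain

# Toy target on {0, 1, 2} with probabilities (0.5, 0.3, 0.2), uniform proposal.
pi = {0: 0.5, 1: 0.3, 2: 0.2}
random.seed(1)
chain = metropolized_independent_sampler(
    target=lambda z: pi[z],
    proposal_sample=lambda: random.randrange(3),
    proposal_pmf=lambda z: 1 / 3,
    n_steps=50_000,
    x0=0,
)
freq0 = chain.count(0) / len(chain)
print(round(freq0, 1))  # near 0.5, the target mass of state 0
```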
Extractors and Pseudorandom Generators
 Journal of the ACM
, 1999
Abstract

Cited by 93 (5 self)
We introduce a new approach to constructing extractors. Extractors are algorithms that transform a "weakly random" distribution into an almost uniform distribution. Explicit constructions of extractors have a variety of important applications, and tend to be very difficult to obtain.
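For intuition about what an extractor does, the classic von Neumann trick is a deterministic extractor for one very restricted source class (i.i.d. biased coin flips); general weak sources require the far harder seeded constructions this abstract introduces. A minimal sketch:

```python
import random

def von_neumann_extract(bits):
    # From i.i.d. biased flips, emit 0 for the pair (0,1) and 1 for (1,0);
    # discard (0,0) and (1,1).  Both surviving pairs have probability
    # p(1-p), so output bits are exactly uniform whatever the bias p.
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

random.seed(2)
biased = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]
extracted = von_neumann_extract(biased)
ones = sum(extracted) / len(extracted)
print(round(ones, 1))  # near 0.5 despite the 0.8-biased input
```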
Extracting randomness from samplable distributions
 In Proceedings of the 41st Annual IEEE Symposium on Foundations of Computer Science
, 2000
Abstract

Cited by 58 (8 self)
The standard notion of a randomness extractor is a procedure which converts any weak source of randomness into an almost uniform distribution. The conversion necessarily uses a small amount of pure randomness, which can be eliminated by complete enumeration in some, but not all, applications. Here, we consider the problem of deterministically converting a weak source of randomness into an almost uniform distribution. Previously, deterministic extraction procedures were known only for sources satisfying strong independence requirements. In this paper, we look at sources which are samplable, i.e., can be generated by an efficient sampling algorithm. We seek an efficient deterministic procedure that, given a sample from any samplable distribution of sufficiently large min-entropy, gives an almost uniformly distributed output. We explore the conditions under which such deterministic extractors exist. We observe that no deterministic extractor exists if the sampler is allowed to use more computational resources than the extractor. On the other hand, if the extractor is allowed (polynomially) more resources than the sampler, we show that deterministic extraction becomes possible. This is true unconditionally in the non-uniform setting (i.e., when the extractor can be computed by a small circuit), and (necessarily) relies on complexity assumptions in the uniform setting. One of our uniform constructions is as follows: assuming that there are problems in E = DTIME(2^{O(n)}) that are not solvable by subexponential-size circuits with Σ gates, there is an efficient extractor that transforms any samplable distribution of length n and sufficiently large min-entropy into an output distribution of length Ω(n). The running time of the extractor is polynomial in n and the circuit complexity of the sampler. These extractors are based on a connection be...
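Min-entropy, the randomness measure used throughout these results, is determined by the single most likely outcome. A two-line illustration:

```python
import math

def min_entropy(pmf):
    # H_inf(X) = -log2(max_x Pr[X = x]): a source "has k bits of
    # min-entropy" when no outcome occurs with probability above 2^-k.
    return -math.log2(max(pmf))

print(min_entropy([0.5, 0.25, 0.25]))  # 1.0 (dominated by the 1/2 mass)
print(min_entropy([0.25] * 4))         # 2.0 (uniform on 4 outcomes)
```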
Extracting randomness using few independent sources
 In Proceedings of the 45th Annual IEEE Symposium on Foundations of Computer Science
, 2004
Abstract

Cited by 49 (6 self)
In this work we give the first deterministic extractors from a constant number of weak sources whose entropy rate is less than 1/2. Specifically, for every δ > 0 we give an explicit construction for extracting randomness from a constant (depending polynomially on 1/δ) number of distributions over {0, 1}^n, each having min-entropy δn. These extractors output n bits, which are 2^-n-close to uniform. This construction uses several results from additive number theory, in particular recent ones by Bourgain, Katz and Tao [BKT03] and by Konyagin [Kon03]. We also consider the related problem of constructing randomness dispersers. For any constant output length m, our dispersers use a constant number of identical distributions, each with min-entropy Ω(log n), and output every possible m-bit string with positive probability. The main tool we use is a variant of the "stepping-up lemma" used in establishing lower bound...
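The classical baseline this abstract goes beyond is the Chor-Goldreich-style two-source extractor, the inner product mod 2, which only works when the entropy rate exceeds 1/2, exactly the barrier broken here. A sketch on an illustrative weak source (one stuck bit, not the paper's δ-sources):

```python
import random

def inner_product_extractor(x, y):
    # Inner product mod 2 of two n-bit strings: a near-unbiased bit
    # whenever both independent sources have min-entropy rate above 1/2.
    return sum(a & b for a, b in zip(x, y)) % 2

random.seed(3)
n = 8

def weak_source():
    # Illustrative weak source: first bit stuck at 1, remaining n-1 bits
    # uniform, so min-entropy is n-1 (rate 7/8 > 1/2).
    return [1] + [random.randrange(2) for _ in range(n - 1)]

samples = 20_000
ones = sum(inner_product_extractor(weak_source(), weak_source())
           for _ in range(samples)) / samples
print(round(ones, 1))  # near 0.5
```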
Simulating Independence: New Constructions of Condensers, Ramsey Graphs, Dispersers, and Extractors
 In Proceedings of the 37th Annual ACM Symposium on Theory of Computing
, 2005
Abstract

Cited by 43 (13 self)
We present new explicit constructions of deterministic randomness extractors, dispersers and related objects. More precisely, a distribution X over binary strings of length n is called a δ-source if it assigns probability at most 2^-δn to any string of length n, and for any δ > 0 we construct the following poly(n)-time computable functions: 2-source disperser: D: ({0, 1}^n)^2 → {0, 1} such that for any two independent δ-sources X1, X2, the support of D(X1, X2) is {0, 1}. Bipartite Ramsey graph: Let N = 2^n. A corollary is that the function D is a 2-coloring of the edges of K_{N,N} (the complete bipartite graph over two sets of N vertices) such that no induced N^δ-by-N^δ subgraph is monochromatic. 3-source extractor: E: ({0, 1}^n)^3 → {0, 1} such that for any three independent δ-sources X1, X2, X3, the output E(X1, X2, X3) is (o(1)-close to being) an unbiased random bit. No previous explicit construction was known for either of these for any δ < 1/2, and these results constitute major progress on long-standing open problems. A component in these results is a new construction of condensers that may be of independent...
Extracting Randomness via Repeated Condensing
 In Proceedings of the 41st Annual IEEE Symposium on Foundations of Computer Science
, 2000
Abstract

Cited by 43 (16 self)
On an input probability distribution with some (min-)entropy, an extractor outputs a distribution with a (near-)maximum entropy rate (namely, the uniform distribution). A natural weakening of this concept is a condenser, whose output distribution has a higher entropy rate than the input distribution (without losing much of the initial entropy). In this paper we construct efficient explicit condensers. The condenser constructions combine (variants or more efficient versions of) ideas from several works, including the block-extraction scheme of [NZ96], the observation made in [SZ94, NT99] that a failure of the block-extraction scheme is also useful, the recursive "win-win" case analysis of [ISW99, ISW00], and the error correction of random sources used in [Tre99]. As a natural by-product (via repeated iteration of condensers), we obtain new extractor constructions. The new extractors give significant qualitative improvements over previous ones for sources of arbitrary min-entropy; they...
Extractors for a constant number of polynomially small minentropy independent sources
 In Proceedings of the 38th Annual ACM Symposium on Theory of Computing
, 2006
Abstract

Cited by 39 (10 self)
We consider the problem of randomness extraction from independent sources. We construct an extractor that can extract from a constant number of independent sources of length n, each of which has min-entropy n^γ for an arbitrarily small constant γ > 0. Our extractor is obtained by composing seeded extractors in simple ways. We introduce a new technique to condense independent somewhere-random sources, which looks like a useful way to manipulate independent sources. Our techniques are different from those used in recent work [BIW04, BKS+05, Raz05, Bou05] for this problem in the sense that they do not rely on any results from additive number theory. Using Bourgain's extractor [Bou05] as a black box, we obtain a new extractor for 2 independent block-sources with few blocks, even when the min-entropy is as small as polylog(n). We also show how to modify the 2-source disperser for linear min-entropy of Barak et al. [BKS+05] and the 3-source extractor of Raz [Raz05] to get dispersers/extractors with exponentially small error and linear output length, where previously both were constant. In terms of Ramsey hypergraphs, for every constant 1 > γ > 0 our construction gives a family of explicit O(1/γ)-uniform hypergraphs on N vertices that avoid cliques and independent sets of size 2^{(log N)^γ}.
Formal Verification of Probabilistic Algorithms
, 2002
Abstract

Cited by 38 (3 self)
This thesis shows how probabilistic algorithms can be formally verified using a mechanical theorem prover. We begin with an extensive foundational development of probability, creating a higher-order logic formalization of mathematical measure theory. This allows the definition of the probability space we use to model a random bit generator, which informally is a stream of coin flips, or technically an infinite sequence of i.i.d. Bernoulli(1/2) random variables. Probabilistic programs are modelled using the state-transformer monad familiar from functional programming, where the random bit generator is passed around in the computation. Functions remove random bits from the generator to perform their calculation, and then pass back the changed random bit generator with the result. Our probability space modelling the random bit generator allows us to give precise probabilistic specifications of such programs, and then verify them in the theorem prover. We also develop technical support designed to expedite verification: probabilistic quantifiers; a compositional property subsuming measurability and independence; and a probabilistic while loop together with a formal concept of termination with probability 1. We also introduce a technique for reducing properties of a probabilistic while loop to properties of programs that are guaranteed to terminate: these can then be established using induction and standard methods of program correctness. We demonstrate the formal framework with some example probabilistic programs: sampling algorithms for four probability distributions; some optimal procedures for generating dice rolls from coin flips; and the symmetric simple random walk. In addition, we verify the Miller-Rabin primality test, a well-known and commercially used probabilistic algorithm. Our fundamental perspective allows us to define a version with strong properties, which we can execute in the logic to prove compositeness of numbers.
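The state-transformer model described above can be sketched in a few lines: a probabilistic program is a function from a bit stream to a (result, remaining stream) pair, and composition threads the generator through. The names and the dice-from-coins example below are illustrative, not taken from the thesis.

```python
def coin(bits):
    # Consume one bit from the generator and pass the rest back,
    # mirroring the state-transformer discipline.
    return next(bits), bits

def die_roll(bits):
    # Sample a value in {1,...,6} by rejection: read 3 bits (a number in
    # 0..7) and retry on 6 or 7.  The loop terminates with probability 1,
    # the notion of termination the thesis formalizes.
    while True:
        b2, bits = coin(bits)
        b1, bits = coin(bits)
        b0, bits = coin(bits)
        v = 4 * b2 + 2 * b1 + b0
        if v < 6:
            return v + 1, bits

stream = iter([1, 1, 1, 0, 1, 0])  # first triple (=7) rejected, then 0b010 = 2
value, rest = die_roll(stream)
print(value)  # 3
```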
True Random Number Generators Secure in a Changing Environment
 In Workshop on Cryptographic Hardware and Embedded Systems (CHES
, 2003
Abstract

Cited by 34 (4 self)
A true random number generator (TRNG) usually consists of two components: an "unpredictable" source with high entropy, and a randomness extractor, a function which, when applied to the source, produces a result that is statistically close to the uniform distribution.
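A common way to realize the extractor component is a seeded, pairwise-independent hash, which by the leftover hash lemma yields output close to uniform when the source's min-entropy sufficiently exceeds the output length. The sketch below is a toy with illustrative parameters, not the construction of the CHES paper:

```python
import random

def hash_extract(source_bits, a, b, m, p):
    # Pairwise-independent hash h_{a,b}(x) = (a*x + b) mod p, truncated
    # to m output bits.  The pair (a, b) is the extractor's random seed.
    x = int("".join(map(str, source_bits)), 2)
    return ((a * x + b) % p) % (1 << m)

# Model the "unpredictable" physical source as 64 biased bits
# (so its min-entropy is well below 64 but still comfortably above m).
random.seed(4)
noisy = [1 if random.random() < 0.7 else 0 for _ in range(64)]

p = (1 << 61) - 1                                   # Mersenne prime modulus
a, b = random.randrange(1, p), random.randrange(p)  # the seed
out = hash_extract(noisy, a, b, m=16, p=p)
print(0 <= out < 1 << 16)  # True: one 16-bit output block
```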