Results 1–6 of 6
Graph Nonisomorphism Has Subexponential Size Proofs Unless The Polynomial-Time Hierarchy Collapses
 SIAM Journal on Computing
, 1998
"... We establish hardness versus randomness tradeoffs for a broad class of randomized procedures. In particular, we create efficient nondeterministic simulations of bounded round ArthurMerlin games using a language in exponential time that cannot be decided by polynomial size oracle circuits with acce ..."
Abstract

Cited by 108 (6 self)
We establish hardness versus randomness tradeoffs for a broad class of randomized procedures. In particular, we create efficient nondeterministic simulations of bounded round Arthur-Merlin games using a language in exponential time that cannot be decided by polynomial size oracle circuits with access to satisfiability. We show that every language with a bounded round Arthur-Merlin game has subexponential size membership proofs for infinitely many input lengths unless exponential time coincides with the third level of the polynomial-time hierarchy (and hence the polynomial-time hierarchy collapses). This provides the first strong evidence that graph nonisomorphism has subexponential size proofs. We set up a general framework for derandomization which encompasses more than the traditional model of randomized computation. For a randomized procedure to fit within this framework, we only require that for any fixed input the complexity of checking whether the procedure succeeds on a given ...
Information-theoretic key agreement: From weak to strong secrecy for free
 Lecture Notes in Computer Science
, 2000
"... Abstract. One of the basic problems in cryptography is the generation of a common secret key between two parties, for instance in order to communicate privately. In this paper we consider informationtheoretically secure key agreement. Wyner and subsequently Csiszár and Körner described and analyzed ..."
Abstract

Cited by 58 (2 self)
Abstract. One of the basic problems in cryptography is the generation of a common secret key between two parties, for instance in order to communicate privately. In this paper we consider information-theoretically secure key agreement. Wyner and subsequently Csiszár and Körner described and analyzed settings for secret-key agreement based on noisy communication channels. Maurer as well as Ahlswede and Csiszár generalized these models to a scenario based on correlated randomness and public discussion. In all these settings, the secrecy capacity and the secret-key rate, respectively, have been defined as the maximal achievable rates at which a highly secret key can be generated by the legitimate partners. However, the privacy requirements in all these definitions were too weak: only the ratio between the adversary’s information and the length of the key was required to be negligible, which tolerates her obtaining a possibly substantial amount of information about the resulting key in an absolute sense. We give natural stronger definitions of secrecy capacity and secret-key rate, requiring that the adversary obtains virtually no information about the entire key. We show not only that secret-key agreement satisfying the strong secrecy condition is possible, but also that the achievable key-generation rates equal those of the previous, weak notions of secrecy capacity and secret-key rate. Hence the unsatisfactory old definitions can be completely replaced by the new ones. We prove these results by a generic reduction of strong to weak key agreement. The reduction makes use of extractors, which allow the required amount of communication to be kept negligible compared to the length of the resulting key.
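The extractor-based reduction the abstract describes amounts to privacy amplification: both parties compress a shared string about which the adversary has partial information into a shorter, nearly uniform key, using a short public seed. A minimal sketch of that pattern, with a seeded SHA-256 hash standing in for a proper extractor (the function name and parameters are illustrative, not the paper's construction):

```python
import hashlib
import secrets

def privacy_amplify(weak_key: bytes, seed: bytes, out_len: int) -> bytes:
    """Distill a partially secret string into a shorter key.

    A seeded hash stands in here for a randomness extractor; this is a
    toy illustration of the pattern, not the paper's scheme.
    """
    out = b""
    counter = 0
    while len(out) < out_len:
        # Counter mode over the seeded hash produces as many bytes as needed.
        out += hashlib.sha256(seed + counter.to_bytes(4, "big") + weak_key).digest()
        counter += 1
    return out[:out_len]

# Both parties hold `weak_key` (Eve knows something about it) and exchange
# the short public `seed`; the distilled key is much shorter than weak_key.
weak_key = secrets.token_bytes(128)
seed = secrets.token_bytes(16)
key = privacy_amplify(weak_key, seed, 32)
```

Note that the seed is public: security rests on the extractor property, not on keeping the seed secret, which is why the extra communication stays negligible.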
Constructions of Near-Optimal Extractors Using Pseudo-Random Generators
 Electronic Colloquium on Computational Complexity
, 1998
"... We introduce a new approach to construct extractors  combinatorial objects akin to expander graphs that have several applications. Our approach is based on error correcting codes and on the NisanWigderson pseudorandom generator. A straightforward application of our approach yields a construction ..."
Abstract

Cited by 20 (3 self)
We introduce a new approach to construct extractors, combinatorial objects akin to expander graphs that have several applications. Our approach is based on error-correcting codes and on the Nisan-Wigderson pseudorandom generator. A straightforward application of our approach yields a construction that is simple to describe and analyze, does not use any of the standard techniques used in related results, and improves or subsumes almost all the previous constructions.

1 Introduction

Informally defined, an extractor is a function that extracts randomness from a weakly random distribution. Explicit constructions of extractors have several applications and are typically very hard to achieve. In this paper we introduce a new approach to the explicit construction of extractors. Our approach yields a construction that improves most of the known results, and that is optimal for certain parameters. Furthermore, our construction is simple and uses techniques that were never used in this field...
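To make the object concrete: a seeded extractor maps an n-bit weak source plus a short truly random seed to m nearly uniform bits. The classical universal-hashing construction (multiplying the source by a random diagonal-constant 0/1 matrix over GF(2)) is a simple instance of this interface; it is different from, and much weaker than, the code-based construction of the paper, and is shown only for illustration:

```python
import random

def hash_extract(x_bits, seed_bits, m):
    """Universal-hashing extractor sketch over GF(2).

    Multiplies the n-bit source x by an m x n matrix whose entries are
    constant along anti-diagonals and come from an (n + m - 1)-bit seed.
    Illustrative only; not the code-based construction of the paper.
    """
    n = len(x_bits)
    assert len(seed_bits) == n + m - 1
    out = []
    for i in range(m):
        row = seed_bits[i:i + n]  # row i of the matrix
        out.append(sum(r & b for r, b in zip(row, x_bits)) % 2)
    return out

rng = random.Random(0)
x = [rng.randrange(2) for _ in range(64)]               # sample from the weak source
seed = [rng.randrange(2) for _ in range(64 + 16 - 1)]   # short truly random seed
key_bits = hash_extract(x, seed, 16)
```

The parameters the literature optimizes are exactly the ones visible here: seed length relative to n, and output length m relative to the source's min-entropy.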
Sampling Under Adverse Conditions
"... It is shown that a good estimation of the mean value of a boolean function f dened on f0; 1g m can be obtained by choosing a number of sample points D that is polynomial in m even if the boolean function is chosen by a malicious adversary that knows the sample points, provided that f is computa ..."
Abstract
It is shown that a good estimation of the mean value of a boolean function f defined on {0,1}^m can be obtained by choosing a number of sample points D that is polynomial in m, even if the boolean function is chosen by a malicious adversary that knows the sample points, provided that f is computable by a circuit of size at most D^{1/(4+ε)} for any ε > 0.

1 Introduction

Let us suppose that an estimable survey organization A is conducting a poll. A takes great care in choosing randomly a few persons, poses them one question, and from their answers it derives a pretty accurate estimate of how the whole population would have responded. What if a rival organization, whose main goal is to ruin the reputation of A, has the possibility to know the persons that have been selected and can also change the question? We show that, under reasonable constraints, organization A can still claim the validity of its estimate.

2 Three Scenarios for Sampling

In the traditional setting, sampling provi...
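The non-adversarial baseline the abstract builds on is plain Monte Carlo estimation of E[f] from random sample points; the paper's contribution is that polynomially many points still suffice when a small-circuit adversary picks f knowing those points. A sketch of the baseline estimator (the helper name is hypothetical, not the paper's procedure):

```python
import random

def estimate_mean(f, m, num_samples, rng):
    """Estimate the mean of a boolean f on {0,1}^m from uniform samples.

    Plain Monte Carlo baseline; the adversarial guarantee of the paper
    is not modeled here.
    """
    hits = sum(f(tuple(rng.randrange(2) for _ in range(m)))
               for _ in range(num_samples))
    return hits / num_samples

# Parity on m bits has mean exactly 1/2, so the estimate should land near 0.5.
rng = random.Random(42)
parity = lambda x: sum(x) % 2
est = estimate_mean(parity, 8, 2000, rng)
```

By a Chernoff bound, 2000 samples put the estimate within a small constant of the true mean with overwhelming probability, independent of m; the circuit-size bound on f is what keeps this working against an adversary who sees the sample points.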
The e2random Entropy Harvester and PRNG for Linux
, 2004
"... Many efficient methods of generating “good ” random numbers exist in the literature of mathematics and computer science. One particular method of generating usable randomness is with “extractors”: graphs which will transform “bad ” randomness (i.e. a smaller ratio of entropy/data, or randomness dist ..."
Abstract
Many efficient methods of generating “good” random numbers exist in the literature of mathematics and computer science. One particular method of generating usable randomness is with “extractors”: graphs which will transform “bad” randomness (i.e. a smaller ratio of entropy/data, or randomness distributed poorly) to “good” randomness (of a provable level of security) by an additional input of only a small number of truly random bits. The current {,u}random PRNG for Linux is not extensible, which prompted work on a new erandom PRNG using these extractors. The work on erandom led to a number of improvements to the entropy harvesting methods used by the Linux kernel, as {,u}random and the entropy harvester are inseparable. The new entropy harvester eh2, combined with the erandom PRNG, makes up the new pseudorandom number generation subsystem, called e2random. This new driver offers greater flexibility and extensibility than the original {,u}random.
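The harvest-then-extract pattern the abstract describes (an entropy pool fed by noisy events, distilled into seed material for a PRNG) can be sketched in a few lines; here a hash stands in for the extractor, timing jitter for the harvested events, and the class is illustrative, not the e2random implementation:

```python
import hashlib
import time

class ToyEntropyPool:
    """Toy harvest-then-extract pool.

    Noisy events are mixed into a running hash; extraction distills the
    pool into fixed-size seed material and folds the output back so the
    pool state moves forward. Illustrative only, not e2random.
    """

    def __init__(self):
        self._pool = hashlib.sha256()

    def mix(self, event: bytes) -> None:
        # Harvest: absorb a noisy event into the pool state.
        self._pool.update(event)

    def extract_seed(self) -> bytes:
        # Extract: distill current pool state into seed material.
        seed = self._pool.digest()
        self._pool.update(seed)  # advance the state so extracts differ
        return seed

pool = ToyEntropyPool()
for _ in range(100):
    # Timing jitter as a stand-in entropy source.
    pool.mix(time.perf_counter_ns().to_bytes(8, "big"))
seed = pool.extract_seed()
```

The seed would then initialize a deterministic PRNG; keeping harvester and extractor as separable stages is the kind of extensibility the abstract attributes to the redesign.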
Long Term Confidentiality: a Survey
, 2012
"... Sensitive electronic data may be required to remain confidential for long periods of time. Yet encryption under a computationally secure cryptosystem cannot provide a guarantee of long term confidentiality, due to potential advances in computing power or cryptanalysis. Long term confidentiality is e ..."
Abstract
Sensitive electronic data may be required to remain confidential for long periods of time. Yet encryption under a computationally secure cryptosystem cannot provide a guarantee of long-term confidentiality, due to potential advances in computing power or cryptanalysis. Long-term confidentiality is ensured by information-theoretically secure ciphers, but at the expense of impractical key agreement and key management. We overview known methods to alleviate these problems, whilst retaining some form of information-theoretic security relevant for long-term confidentiality.