Results 1–10 of 26
Foundations of Cryptography (Fragments of a Book)
, 1995
Abstract

Cited by 142 (21 self)
this paper dates to early 1983. Yet the paper, having been rejected three times by major conferences, first appeared in public only in 1985, concurrently with the paper of Babai [B85].) A restricted form of interactive proofs, known by the name Arthur–Merlin games, was introduced by Babai [B85]. (The restricted form turned out to be equivalent in power; see Section [reference missing].) The interactive proof for Graph Non-Isomorphism is due to Goldreich, Micali and Wigderson. The concept of zero-knowledge was introduced by Goldwasser, Micali and Rackoff in the same paper quoted above [GMR85]. Their paper also contained a perfect zero-knowledge proof for Quadratic Non-Residuosity. The perfect zero-knowledge proof system for Graph Isomorphism is due to Goldreich, Micali and Wigderson [GMW86]. The latter paper is also the source of the zero-knowledge proof systems for all languages in NP, using any (non-uniformly) one-way function. (Brassard and Crépeau later constructed alternative zero-knowledge proof systems for NP, using a stronger intractability assumption, specifically the intractability of the Quadratic Residuosity Problem.) The cryptographic applications of zero-knowledge proofs were the very motivation for their presentation in [GMR85]. Zero-knowledge proofs were applied to solve cryptographic problems in [FRW85] and [CF85]. However, many more applications became possible once it was shown how to construct zero-knowledge proof systems for every language in NP. In particular, general methodologies for the construction of cryptographic protocols have appeared in [GMW86, GW87].
Efficient Cryptographic Schemes Provably as Secure as Subset Sum
 Journal of Cryptology
, 1993
Abstract

Cited by 78 (8 self)
We show very efficient constructions for a pseudorandom generator and for a universal one-way hash function based on the intractability of the subset sum problem for certain dimensions. (Pseudorandom generators can be used for private-key encryption, and universal one-way hash functions for signature schemes.) The increase in efficiency in our construction is due to the fact that many bits can be generated/hashed with one application of the assumed one-way function. All our constructions can be implemented in NC using an optimal number of processors. Part of this work was done while both authors were at UC Berkeley, and part while the second author was at the IBM Almaden Research Center. Research supported by NSF grant CCR 88-13632. A preliminary version of this paper appeared in Proc. of the 30th Symp. on Foundations of Computer Science, 1989.

1 Introduction

Many cryptosystems are based on the intractability of number-theoretic problems such as factoring and discrete logarit...
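The efficiency claim (many output bits per application of the assumed function) can be illustrated with the subset-sum function itself. A toy sketch, with hypothetical names and parameters of our own choosing rather than the paper's: given n public m-bit weights with m > n, mapping an n-bit seed to the sum of the selected weights mod 2^m yields m output bits from one application.

```python
import secrets

def subset_sum_bits(weights, seed_bits, m):
    """Subset-sum function: add the weights selected by the seed bits,
    reduce mod 2^m, and return the result as an m-bit string."""
    total = sum(a for a, x in zip(weights, seed_bits) if x) % (2 ** m)
    return format(total, f"0{m}b")

# Toy parameters; real instantiations use dimensions in the hundreds.
n, m = 8, 12                                       # m > n: n seed bits -> m output bits
weights = [secrets.randbits(m) for _ in range(n)]  # public random weights
seed = [secrets.randbits(1) for _ in range(n)]     # secret seed
output = subset_sum_bits(weights, seed, m)         # m candidate pseudorandom bits
```

Choosing m < n instead makes the same map compressing, which is the regime used for universal one-way hashing.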
A knapsacktype public key cryptosystem based on arithmetic in finite fields
 IEEE Trans. Inform. Theory
, 1988
Abstract

Cited by 40 (0 self)
Abstract: A new knapsack-type public key cryptosystem is introduced. The system is based on a novel application of arithmetic in finite fields, following a construction by Bose and Chowla. By appropriately choosing the parameters, one can control the density of the resulting knapsack, which is the ratio between the number of elements in the knapsack and their size in bits. In particular, the density can be made high enough to foil “low-density” attacks against our system. At the moment, no attacks capable of “breaking” this system in a reasonable amount of time are known.
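The density the abstract defines (number of elements divided by their size in bits) is straightforward to compute. A minimal sketch with a hypothetical helper name; lattice-based low-density attacks are known to succeed only when this ratio is small, which is why the construction pushes it high.

```python
def knapsack_density(elements):
    """Density of a knapsack: the number of elements divided by the
    bit-size of the largest element, as defined in the abstract."""
    n = len(elements)
    bits = max(a.bit_length() for a in elements)
    return n / bits

# Toy example: 8 elements whose largest member is 10 bits wide.
d = knapsack_density([513, 700, 800, 900, 1000, 600, 750, 1023])  # 8 / 10 = 0.8
```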
Generalized compact knapsacks are collision resistant
 In ICALP (2
, 2006
Abstract

Cited by 40 (14 self)
A step in the direction of creating efficient cryptographic functions based on worst-case hardness was ...
Lecture Notes on Cryptography
, 2001
Abstract

Cited by 17 (0 self)
This is a set of lecture notes on cryptography compiled for 6.87s, a one-week-long course on cryptography taught at MIT by Shafi Goldwasser and Mihir Bellare in the summers of 1996–2001. The notes were formed by merging notes written for Shafi Goldwasser’s Cryptography and Cryptanalysis course at MIT with notes written for Mihir Bellare’s Cryptography and network security course at UCSD. In addition, Rosario Gennaro (as Teaching Assistant for the course in 1996) contributed Section 9.6, Section 11.4, Section 11.5, and Appendix D to the notes, and also compiled, from various sources, some of the problems in Appendix E. Cryptography is of course a vast subject. The thread followed by these notes is to develop and explain the notion of provable security and its usage for the design of secure protocols. Much of the material in Chapters 2, 3 and 7 is a result of scribe notes, originally taken by MIT graduate students who attended Professor Goldwasser’s Cryptography and Cryptanalysis course over the years, and later edited by Frank D’Ippolito, who was a teaching assistant for the course in 1991. Frank also contributed much of the advanced number-theoretic material in the Appendix. Some of the material in Chapter 3 is from the chapter on Cryptography, by R. Rivest, in the Handbook of Theoretical Computer Science. Chapters 4, 5, 6, 8 and 10, and Sections 9.5 and 7.4.6, were written by Professor Bellare for his Cryptography and network security course at UCSD.
Random Polynomials and Polynomial Factorization
 To appear in Automata, Languages and Programming, Proceedings of the 23rd ICALP Colloquium, Paderborn, July 1996, F. Meyer auf der Heide, ed.
, 1996
Abstract

Cited by 7 (3 self)
We give a precise average-case analysis of a complete polynomial factorization chain over finite fields by methods based on generating functions and singularity analysis.
Practical Security in PublicKey Cryptography
 ICICS 2001, Lecture Notes in Computer Science
, 2002
Abstract

Cited by 6 (1 self)
Abstract. Since the appearance of public-key cryptography in Diffie and Hellman's seminal paper, many schemes have been proposed, but many have been broken. Indeed, for many people, the simple fact that a cryptographic algorithm withstands cryptanalytic attacks for several years is considered a kind of validation. But some schemes took a long time before being widely studied, and were perhaps broken only thereafter. A much more convincing line of research has tried to provide “provable” security for cryptographic protocols, in a complexity-theoretic sense: if one can break the cryptographic protocol, one can “efficiently” solve the underlying problem. Unfortunately, very few practical schemes can be proven secure in this so-called “standard model”, because such a security level rarely comes together with efficiency. Moreover, for a long time security proofs were only performed in an asymptotic framework, which provides some confidence in the scheme, but only for very large parameters, and thus for impractical schemes. A recent trend consists in providing very efficient reductions, with a practical meaning: with usual parameters (such as 1024-bit RSA moduli) the computational cost of any attack is actually 2^72, given the state of the art on classical problems (e.g., integer factoring). In this paper, we focus on practical schemes together with their “reductionist” security proofs. We cover the two main goals that public-key cryptography is devoted to solving: authentication with digital signatures, and confidentiality with public-key encryption schemes.
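Concrete figures like the 2^72 above come from plugging practical parameters into the heuristic running time of the best known factoring algorithms. As an illustrative sketch (not the abstract's own calculation), the asymptotic cost of the general number field sieve with the o(1) term dropped, so the constant is only indicative:

```python
import math

def nfs_log2_cost(modulus_bits):
    """Heuristic asymptotic cost of the general number field sieve,
    L_N[1/3, (64/9)^(1/3)], expressed as log2 of the operation count.
    The o(1) term is dropped, so this is only a rough estimate."""
    ln_n = modulus_bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    return c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(2)

work = nfs_log2_cost(1024)  # roughly 87 "bits" of work for a 1024-bit modulus
```

Published concrete estimates (like the 2^72 figure) adjust this asymptotic value with implementation-level constants and the state of the art in sieving.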
Removing Randomness From Computational Number Theory
, 1989
Abstract

Cited by 3 (1 self)
In recent years, many probabilistic algorithms (i.e., algorithms that can toss coins) that run in polynomial time have been discovered for problems with no known deterministic polynomial time algorithms. Perhaps the most famous example is the problem of testing large (say, 100 digit) numbers for primality. Even for problems which are known to have deterministic polynomial time algorithms, these algorithms are often not as fast as some probabilistic algorithms for the same problem. Even though probabilistic algorithms are useful in practice, we would like to know, for both theoretical and practical reasons, if randomization is really necessary to obtain the most efficient algorithms for certain problems. That is, we would like to know for which problems there is an inherent gap between the deterministic and probabilistic complexities of these problems. In this research, we consider two problems of a number theoretic nature: factoring polynomials over finite fields and constructing irred...
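The primality example mentioned above is typically handled with a coin-tossing test in the Miller–Rabin style. A self-contained sketch for illustration only; it is not the paper's subject, which is derandomizing polynomial factorization and irreducible-polynomial construction.

```python
import random

def miller_rabin(n, rounds=20):
    """Probabilistic primality test: a composite n is caught with
    probability at least 1 - 4**(-rounds); a prime is never rejected."""
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # toss coins to pick a base
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                 # a witnesses compositeness
    return True                          # probably prime
```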
The Applications of Genetic Algorithms in Cryptanalysis
, 1996
Abstract

Cited by 3 (0 self)
This thesis describes a method of deciphering messages encrypted with rotor machines, utilising a Genetic Algorithm to search the keyspace. A fitness measure based on the phi test for non-randomness of text is described, and the results show that an unknown three-rotor machine can generally be cryptanalysed with about 4000 letters of ciphertext. The results are compared to those given by a previously published technique and found to be superior. Acknowledgements: I would like to thank my supervisors, Vic Rayward-Smith and Geoff McKeown, for their help and encouragement. Contents: 1 Introduction; 2 Statistical Inference; 2.1 Introduction; 2.2 Uncertainty; 2.2.1 Rules of Probability; 2.2.2 Frequency Probability; 2.2.3 Subjective Probability; 2.3 Modelling...
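The phi test the abstract relies on counts repeated letters: correctly decrypted (English-like) text has a much higher repeat rate than random text. A sketch of the statistic, plus a hypothetical fitness in that spirit; the function names and the 0.0667 English coincidence constant are our illustrative choices, and the thesis's exact measure may differ.

```python
from collections import Counter

def phi_statistic(text):
    """Phi test for non-randomness: sum of f * (f - 1) over the letter
    frequencies f. Random text of length N scores about N*(N-1)/26;
    English-like text scores about 0.0667 * N * (N-1)."""
    freqs = Counter(c for c in text.upper() if c.isalpha())
    return sum(f * (f - 1) for f in freqs.values())

def phi_fitness(text):
    """Hypothetical GA fitness: closeness of the observed phi to the
    English expectation (higher is better for a candidate key)."""
    n = sum(1 for c in text if c.isalpha())
    expected_english = 0.0667 * n * (n - 1)
    return -abs(phi_statistic(text) - expected_english)
```

A GA in this setting would decrypt the ciphertext under each candidate rotor setting and rank candidates by such a fitness.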