Results 1–10 of 60
Public-key cryptosystems based on composite degree residuosity classes
 In Advances in Cryptology — EUROCRYPT '99
, 1999
"... This paper investigates a novel computational problem, namely the Composite Residuosity Class Problem, and its applications to publickey cryptography. We propose a new trapdoor mechanism and derive from this technique three encryption schemes: a trapdoor permutation and two homomorphic probabilist ..."
Abstract

Cited by 991 (4 self)
This paper investigates a novel computational problem, namely the Composite Residuosity Class Problem, and its applications to public-key cryptography. We propose a new trapdoor mechanism and derive from this technique three encryption schemes: a trapdoor permutation and two homomorphic probabilistic encryption schemes computationally comparable to RSA. Our cryptosystems, based on usual modular arithmetic, are provably secure under appropriate assumptions in the standard model.
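The additive homomorphism of the probabilistic schemes described above can be illustrated with a minimal toy sketch of the Paillier construction (with g = n + 1). The primes and messages below are illustrative only and far too small for real use.

```python
import random
from math import gcd

p, q = 11, 13            # toy primes; a real modulus would be thousands of bits
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)     # valid since gcd(lam, n) == 1 here

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n applied to c^lam mod n^2, then scaled by mu
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c1, c2 = encrypt(5), encrypt(9)
assert decrypt((c1 * c2) % n2) == (5 + 9) % n   # multiplying ciphertexts adds plaintexts
```

Multiplying two ciphertexts modulo n² yields an encryption of the sum of the plaintexts, which is the homomorphic property the schemes provide.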
Number-theoretic constructions of efficient pseudo-random functions
 In 38th Annual Symposium on Foundations of Computer Science
, 1997
"... ..."
RSA-Based Undeniable Signatures
"... We present the first undeniable signatures scheme based on RSA. Since their introduction in 1989 a significant amount of work has been devoted to the investigation of undeniable signatures. So far, this work has been based on discrete log systems. In contrast, our scheme uses regular RSA signature ..."
Abstract

Cited by 78 (5 self)
We present the first undeniable signature scheme based on RSA. Since their introduction in 1989, a significant amount of work has been devoted to the investigation of undeniable signatures. So far, this work has been based on discrete log systems. In contrast, our scheme uses regular RSA signatures to generate undeniable signatures. In this new setting, both the signature and verification exponents of RSA are kept secret by the signer, while the public key consists of a composite modulus and a sample RSA signature on a single public message. Our scheme possesses several attractive properties. First of all, provable security, as forging the undeniable signatures is as hard as forging regular RSA signatures. Second, both the confirmation and denial protocols are zero-knowledge. In addition, these protocols are efficient (particularly, the confirmation protocol involves only two rounds of communication and a small number of exponentiations). Furthermore, the RSA-based structure of our scheme provides simple and elegant solutions for adding several of the more advanced properties of undeniable signatures found in the literature, including convertibility of the undeniable signatures (into publicly verifiable ones), the possibility to delegate the ability to confirm and deny signatures to a third party without giving up the power to sign, and the existence of distributed (threshold) versions of the signing and confirmation operations. Due to the above properties and the fact that our undeniable signatures are identical in form to standard RSA signatures, the scheme we present becomes a very attractive candidate for practical implementations.
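Since the undeniable signatures above are identical in form to standard RSA signatures, the underlying primitive is ordinary textbook RSA signing, sketched below with tiny illustrative parameters and no padding. In the scheme itself both e and d would be kept secret by the signer; here e is used openly just to show correctness.

```python
# toy textbook RSA signing -- illustrative parameters only, no padding
p, q = 61, 53
n = p * q                 # 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17
d = pow(e, -1, phi)       # private signing exponent

m = 65
s = pow(m, d, n)          # the signature: same form as an undeniable signature
assert pow(s, e, n) == m  # ordinary verification (e would be secret in the scheme)
```

In the undeniable setting, verification cannot be done this way by outsiders; instead the signer proves validity interactively via the zero-knowledge confirmation protocol.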
Towards the Equivalence of Breaking the Diffie-Hellman Protocol and Computing Discrete Logarithms
, 1994
"... Let G be an arbitrary cyclic group with generator g and order jGj with known factorization. G could be the subgroup generated by g within a larger group H. Based on an assumption about the existence of smooth numbers in short intervals, we prove that breaking the DiffieHellman protocol for G and ..."
Abstract

Cited by 78 (6 self)
Let G be an arbitrary cyclic group with generator g and order |G| with known factorization. G could be the subgroup generated by g within a larger group H. Based on an assumption about the existence of smooth numbers in short intervals, we prove that breaking the Diffie-Hellman protocol for G and base g is equivalent to computing discrete logarithms in G to the base g when a certain side information string S of length 2 log |G| is given, where S depends only on |G| but not on the definition of G and appears to be of no help for computing discrete logarithms in G. If every prime factor p of |G| is such that one of a list of expressions in p, including p − 1 and p + 1, is smooth for an appropriate smoothness bound, then S can efficiently be constructed and therefore breaking the Diffie-Hellman protocol is equivalent to computing discrete logarithms.
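For reference, the Diffie-Hellman protocol whose security is analyzed above can be sketched in a toy multiplicative group mod a small prime. The parameters are illustrative only.

```python
import random

# toy Diffie-Hellman key exchange; p and g are far too small for real security
p, g = 467, 2
a = random.randrange(2, p - 1)      # Alice's secret exponent
b = random.randrange(2, p - 1)      # Bob's secret exponent
A = pow(g, a, p)                    # sent in the clear by Alice
B = pow(g, b, p)                    # sent in the clear by Bob

shared_alice = pow(B, a, p)         # Alice computes (g^b)^a
shared_bob = pow(A, b, p)           # Bob computes (g^a)^b
assert shared_alice == shared_bob   # both hold g^(a*b) mod p
```

Breaking the protocol means computing g^(ab) from g^a and g^b alone; the paper relates the hardness of this task to that of computing a or b directly.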
Quantum cryptanalysis of hidden linear functions
 in Proceedings of Crypto’95, Lecture Notes in Comput. Sci. 963
, 1995
"... Abstract. Recently there has been a great deal of interest in the power of \Quantum Computers " [4, 15, 18]. The driving force is the recent beautiful result of Shor that shows that discrete log and factoring are solvable in random quantum polynomial time [15]. We use a method similar to Shor&a ..."
Abstract

Cited by 75 (0 self)
Abstract. Recently there has been a great deal of interest in the power of "Quantum Computers" [4, 15, 18]. The driving force is the recent beautiful result of Shor that shows that discrete log and factoring are solvable in random quantum polynomial time [15]. We use a method similar to Shor's to obtain a general theorem about quantum polynomial time. We show that any cryptosystem based on what we refer to as a "hidden linear form" can be broken in quantum polynomial time. Our results imply that the discrete log problem is solvable in quantum polynomial time over any group, including Galois fields and elliptic curves. Finally, we introduce the notion of "junk bits", which are helpful when performing classical computations that are not injective.
On Interpolation and Automatization for Frege Systems
, 2000
"... The interpolation method has been one of the main tools for proving lower bounds for propositional proof systems. Loosely speaking, if one can prove that a particular proof system has the feasible interpolation property, then a generic reduction can (usually) be applied to prove lower bounds for the ..."
Abstract

Cited by 52 (8 self)
The interpolation method has been one of the main tools for proving lower bounds for propositional proof systems. Loosely speaking, if one can prove that a particular proof system has the feasible interpolation property, then a generic reduction can (usually) be applied to prove lower bounds for the proof system, sometimes assuming a (usually modest) complexity-theoretic assumption. In this paper, we show that this method cannot be used to obtain lower bounds for Frege systems, or even for TC^0-Frege systems. More specifically, we show that unless factoring (of Blum integers) is feasible, neither Frege nor TC^0-Frege has the feasible interpolation property. In order to carry out our argument, we show how to carry out proofs of many elementary axioms/theorems of arithmetic in polynomial-size TC^0-Frege. As a corollary, we obtain that TC^0-Frege, as well as any proof system that polynomially simulates it, is not automatizable (under the assumption that factoring of Blum integ...
The Relationship Between Breaking the Diffie-Hellman Protocol and Computing Discrete Logarithms
, 1998
"... Both uniform and nonuniform results concerning the security of the DiffieHellman keyexchange protocol are proved. First, it is shown that in a cyclic group G of order jGj = Q p e i i , where all the multiple prime factors of jGj are polynomial in log jGj, there exists an algorithm that re ..."
Abstract

Cited by 49 (3 self)
Both uniform and non-uniform results concerning the security of the Diffie-Hellman key-exchange protocol are proved. First, it is shown that in a cyclic group G of order |G| = ∏ p_i^{e_i}, where all the multiple prime factors of |G| are polynomial in log |G|, there exists an algorithm that reduces the computation of discrete logarithms in G to breaking the Diffie-Hellman protocol in G and has complexity √(max{ν(p_i)}) · (log |G|)^O(1), where ν(p) stands for the minimum of the set of largest prime factors of all the numbers d in the interval [p − 2√p + 1, p + 2√p + 1]. Under the unproven but plausible assumption that ν(p) is polynomial in log p, this reduction implies that the Diffie-Hellman problem and the discrete logarithm problem are polynomial-time equivalent in G. Second, it is proved that the Diffie-Hellman problem and the discrete logarithm problem are equivalent in a uniform sense for groups whose orders belong to certain classes: there exists a p...
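The discrete logarithm problem that the reduction above targets can be stated concretely: given h = g^x in a cyclic group, recover x. A naive exhaustive-search solver, feasible only in toy groups, makes the problem's statement explicit.

```python
# naive discrete logarithm by exhaustive search -- only workable in toy groups;
# for cryptographic group sizes this loop is hopeless, which is the point
p, g = 467, 2   # 2 generates the multiplicative group mod 467

def dlog(h):
    x, cur = 0, 1
    while cur != h:
        cur = (cur * g) % p
        x += 1
    return x

assert dlog(pow(g, 123, p)) == 123
```

The paper's reductions show that, under its smoothness assumption, an oracle breaking Diffie-Hellman in G would let one compute such logarithms efficiently, and vice versa.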
Synthesizers and Their Application to the Parallel Construction of Pseudo-Random Functions
, 1995
"... A pseudorandom function is a fundamental cryptographic primitive that is essential for encryption, identification and authentication. We present a new cryptographic primitive called pseudorandom synthesizer and show how to use it in order to get a parallel construction of a pseudorandom function. ..."
Abstract

Cited by 48 (10 self)
A pseudorandom function is a fundamental cryptographic primitive that is essential for encryption, identification and authentication. We present a new cryptographic primitive called pseudorandom synthesizer and show how to use it in order to get a parallel construction of a pseudorandom function. We show several NC¹ implementations of synthesizers based on concrete intractability assumptions such as factoring and the Diffie-Hellman assumption. This yields the first parallel pseudorandom functions (based on standard intractability assumptions) and the only alternative to the original construction of Goldreich, Goldwasser and Micali. In addition, we show parallel constructions of synthesizers based on other primitives such as weak pseudorandom functions or trapdoor one-way permutations. The security of all our constructions is similar to the security of the underlying assumptions. The connection with problems in Computational Learning Theory is discussed.
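The parallel structure of the synthesizer-based construction can be sketched as follows: one seed pair per input bit, selected by the bit and then combined pairwise in a log-depth tree. The synthesizer S below is a hash-based stand-in purely to show the shape; in the paper S would be instantiated from factoring or Diffie-Hellman.

```python
import hashlib
import os

def S(a, b):
    # stand-in synthesizer: SHA-256 of the concatenation; used here only to
    # illustrate the tree structure, not as a provable instantiation
    return hashlib.sha256(a + b).digest()

n = 8  # input length in bits (must be a power of two for this simple sketch)
seeds = [(os.urandom(16), os.urandom(16)) for _ in range(n)]  # the PRF key

def prf(bits):
    # select one seed per input bit, then collapse pairwise: depth is log2(n)
    layer = [seeds[i][bit] for i, bit in enumerate(bits)]
    while len(layer) > 1:
        layer = [S(layer[i], layer[i + 1]) for i in range(0, len(layer), 2)]
    return layer[0]

out = prf([0, 1, 1, 0, 0, 0, 1, 1])
```

All synthesizer applications within a level are independent, which is what makes the construction parallelizable, in contrast to the inherently sequential bit-by-bit GGM tree.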
Cascade Ciphers: The Importance of Being First
, 1993
"... The security of cascade ciphers, in which by definition the keys of the component ciphers are independent, is considered. It is shown by a counterexample that the intuitive result, formally stated and proved in the literature, that a cascade is at least as strong as the strongest component cipher, ..."
Abstract

Cited by 31 (3 self)
The security of cascade ciphers, in which by definition the keys of the component ciphers are independent, is considered. It is shown by a counterexample that the intuitive result, formally stated and proved in the literature, that a cascade is at least as strong as the strongest component cipher, requires the uninterestingly restrictive assumption that the enemy cannot exploit information about the plaintext statistics. It is proved, for very general notions of breaking a cipher and of problem difficulty, that a cascade is at least as difficult to break as the first component cipher. A consequence of this result is that, if the ciphers commute, then a cascade is at least as difficult to break as the most-difficult-to-break component cipher, i.e., the intuition that a cryptographic chain is at least as strong as its strongest link is then provably correct. It is noted that additive stream ciphers do commute, and this fact is used to suggest a strategy for designing secure practical ci...
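The commutation property of additive stream ciphers that the result relies on is easy to demonstrate: XOR-based encryption applies the same operation regardless of order. A toy sketch with repeating-byte keystreams (illustrative only, not a secure keystream):

```python
# additive (XOR) stream ciphers commute: cascading in either order gives the
# same ciphertext, so either component can be regarded as "first"
def stream_cipher(key, data):
    # toy keystream: the key bytes repeated; a real cipher derives a
    # pseudorandom keystream from the key
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

msg = b"attack at dawn"
k1, k2 = b"\x13\x37", b"\xab"
c12 = stream_cipher(k1, stream_cipher(k2, msg))
c21 = stream_cipher(k2, stream_cipher(k1, msg))
assert c12 == c21   # order of the cascade does not matter
```

Because XOR is commutative and associative, a cascade of additive stream ciphers is equivalent to each component being applied first, which is what lets the "at least as strong as the first cipher" theorem upgrade to "at least as strong as the strongest".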
Non-automatizability of bounded-depth Frege proofs
, 1999
"... In this paper, we show how to extend the argument due to Bonet, Pitassi and Raz to show that boundeddepth Frege proofs do not have feasible interpolation, assuming that factoring of Blum integers or computing the DiffieHellman function is sufficiently hard. It follows as a corollary that boundedde ..."
Abstract

Cited by 30 (8 self)
In this paper, we show how to extend the argument due to Bonet, Pitassi and Raz to show that bounded-depth Frege proofs do not have feasible interpolation, assuming that factoring of Blum integers or computing the Diffie-Hellman function is sufficiently hard. It follows as a corollary that bounded-depth Frege is not automatizable; in other words, there is no deterministic polynomial-time algorithm that will output a short proof if one exists. A notable feature of our argument is its simplicity.