Results 1 – 10 of 16
Design and Analysis of Practical Public-Key Encryption Schemes Secure against Adaptive Chosen Ciphertext Attack
 SIAM Journal on Computing
, 2001
Abstract

Cited by 189 (11 self)
A new public key encryption scheme, along with several variants, is proposed and analyzed. The scheme and its variants are quite practical, and are proved secure against adaptive chosen ciphertext attack under standard intractability assumptions. These appear to be the first public-key encryption schemes in the literature that are simultaneously practical and provably secure.
Universal Hash Proofs and a Paradigm for Adaptive Chosen Ciphertext Secure Public-Key Encryption
, 2001
Abstract

Cited by 139 (7 self)
We present several new and fairly practical public-key encryption schemes and prove them secure against adaptive chosen ciphertext attack. One scheme is based on Paillier's Decision Composite Residuosity (DCR) assumption [7], while another is based on the classical Quadratic Residuosity (QR) assumption. The analysis is in the standard cryptographic model, i.e., the security of our schemes does not rely on the Random Oracle model. We also introduce the notion of a universal hash proof system. Essentially, this is a special kind of non-interactive zero-knowledge proof system for an NP language. We do not show that universal hash proof systems exist for all NP languages, but we do show how to construct very efficient universal hash proof systems for a general class of group-theoretic language membership problems. Given an efficient universal hash proof system for a language with certain natural cryptographic indistinguishability properties, we show how to construct an efficient public-key encryption scheme secure against adaptive chosen ciphertext attack in the standard model. Our construction only uses the universal hash proof system as a primitive: no other primitives are required, although even more efficient encryption schemes can be obtained by using hash functions with appropriate collision-resistance properties. We show how to construct efficient universal hash proof systems for languages related to the DCR and QR assumptions. From these we get corresponding public-key encryption schemes that are secure under these assumptions. We also show that the Cramer-Shoup encryption scheme (which up until now was the only practical encryption scheme that could be proved secure against adaptive chosen ciphertext attack under a reasonable assumption, namely, the Decision...
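The public/private evaluation agreement at the heart of a hash proof system can be illustrated on the DDH language. The following toy sketch (tiny parameters chosen purely for illustration, not the paper's scheme at cryptographic strength) shows the two evaluation paths producing the same hash on a word in the language:

```python
# Toy "projective hash" for the DDH language. Parameters are a tiny
# group (p = 23, subgroup of order q = 11); real schemes use
# cryptographically sized groups.
import secrets

p, q = 23, 11          # p = 2q + 1; work in the order-q subgroup of Z_p*
g1, g2 = 4, 9          # two generators of that subgroup

# Key generation: secret hashing key (a, b), public projection key alpha.
a = secrets.randbelow(q)
b = secrets.randbelow(q)
alpha = (pow(g1, a, p) * pow(g2, b, p)) % p

# A word in the language: a DDH tuple (u1, u2) = (g1^r, g2^r), witness r.
r = secrets.randbelow(q)
u1, u2 = pow(g1, r, p), pow(g2, r, p)

# Private evaluation (holder of (a, b)) and public evaluation (holder of
# the witness r plus alpha) agree on every word in the language.
priv = (pow(u1, a, p) * pow(u2, b, p)) % p
pub = pow(alpha, r, p)
assert priv == pub   # g1^(ra) * g2^(rb) == (g1^a * g2^b)^r
```

On a non-DDH tuple the private hash is statistically unpredictable given alpha alone, which is the "universal" property the encryption construction exploits.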
A Proposal for an ISO Standard for Public Key Encryption (version 2.0)
, 2001
Abstract

Cited by 111 (3 self)
This document should be viewed less as a first draft of a standard for public-key encryption, and more as a proposal for what such a draft standard should contain. It is hoped that this proposal will serve as a basis for discussion, from which a consensus for a standard may be formed.
Exposure-Resilient Functions and All-Or-Nothing Transforms
, 2000
Abstract

Cited by 63 (11 self)
We study the problem of partial key exposure. Standard cryptographic definitions and constructions do not guarantee any security even if a tiny fraction of the secret key is compromised. We show how to build cryptographic primitives that remain secure even when an adversary is able to learn almost all of the secret key.
Another Look at “Provable Security”
, 2004
Abstract

Cited by 59 (12 self)
We give an informal analysis and critique of several typical “provable security” results. In some cases there are intuitive but convincing arguments for rejecting the conclusions suggested by the formal terminology and “proofs,” whereas in other cases the formalism seems to be consistent with common sense. We discuss the reasons why the search for mathematically convincing theoretical evidence to support the security of public-key systems has been an important theme of researchers. But we argue that the theorem-proof paradigm of theoretical mathematics is often of limited relevance here and frequently leads to papers that are confusing and misleading. Because our paper is aimed at the general mathematical public, it is self-contained and as jargon-free as possible.
Distributed Pseudo-Random Functions and KDCs
Advances in Cryptology – EUROCRYPT '99, Volume 1592 of Lecture Notes in Computer Science
, 1999
Abstract

Cited by 29 (0 self)
This work describes schemes for distributing among n servers the evaluation of a function f which approximates a random function, such that only authorized subsets of servers are able to compute the function. A user who wants to compute f(x) sends x to the members of an authorized subset and receives information that enables him to compute f(x). We require that such a scheme is consistent, i.e., that given an input x all authorized subsets compute the same value f(x). The solutions we present enable the operation of many servers, preventing bottlenecks and single points of failure; no single entity can compromise the security of the entire network. The solutions can be used to distribute the operation of a Key Distribution Center (KDC). They are far better than the known solutions based on partitioning into domains or replication, and are especially suited to handling users of multicast groups.
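The query-and-combine flow above can be sketched in its simplest n-out-of-n form; the per-server PRF (HMAC-SHA256 here) and the XOR combiner are illustrative choices only, and the paper's constructions additionally support general threshold access structures:

```python
# Minimal n-out-of-n sketch of a distributed PRF: f(x) is the XOR of n
# independent per-server PRFs, so no proper subset of servers can
# compute (or bias) f. Illustrates only the consistency requirement.
import hmac
import hashlib
import secrets

n = 5
server_keys = [secrets.token_bytes(32) for _ in range(n)]  # one key per server

def server_share(i: int, x: bytes) -> bytes:
    """Server i's response to a user's query x."""
    return hmac.new(server_keys[i], x, hashlib.sha256).digest()

def combine(shares: list[bytes]) -> bytes:
    """The user XORs all n shares to recover f(x)."""
    acc = shares[0]
    for s in shares[1:]:
        acc = bytes(a ^ b for a, b in zip(acc, s))
    return acc

x = b"session-key-request"
fx = combine([server_share(i, x) for i in range(n)])
# Consistency: every user querying the (single) authorized set of all n
# servers computes the same value f(x).
assert fx == combine([server_share(i, x) for i in range(n)])
```

In the KDC application, f(x) for a group name x serves directly as that group's key, with no single server ever holding it.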
The Foundations of Modern Cryptography
, 1998
Abstract

Cited by 24 (0 self)
In our opinion, the Foundations of Cryptography are the paradigms, approaches and techniques used to conceptualize, define and provide solutions to natural cryptographic problems. In this essay, we survey some of these paradigms, approaches and techniques as well as some of the fundamental results obtained using them. Special effort is made in an attempt to dispel common misconceptions regarding these paradigms and results. © Copyright 1998 by Oded Goldreich. Permission to make copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that new copies bear this notice and the full citation on the first page. Abstracting with credit is permitted. A preliminary version of this essay appeared in the proceedings of Crypto '97 (Springer's Lecture Notes in Computer Science, Vol. 1294).
Exposure-Resilient Cryptography
, 2000
Abstract

Cited by 24 (2 self)
We develop the notion of Exposure-Resilient Cryptography. While standard cryptographic definitions and constructions do not guarantee any security even if a tiny fraction of the secret entity (e.g., cryptographic key) is compromised, the objective of Exposure-Resilient Cryptography is to build information structures such that almost complete (intentional or unintentional) exposure of such a structure still protects the secret information embedded in this structure. The key to our approach is a new primitive of independent interest, which we call an Exposure-Resilient Function (ERF): a deterministic function whose output appears random (in a perfect, statistical or computational sense) even if almost all the bits of the input are known. ERFs by themselves efficiently solve the partial exposure of secrets in the setting where the secret is simply a random value, as in private-key cryptography. They can also be viewed as very secure pseudorandom generators and have many other applica...
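As a toy illustration of the ERF interface only (SHA-256 standing in, heuristically, for the paper's extractor- and PRG-based constructions, which carry actual proofs):

```python
# Toy sketch of the ERF interface, NOT the paper's construction: the
# "ERF" here is just SHA-256, whose output heuristically stays
# unpredictable as long as enough input bits remain hidden.
import hashlib
import secrets

SECRET_BYTES = 64   # size of the stored random secret r
HIDDEN = 16         # adversary learns all but the last HIDDEN bytes of r

def erf(r: bytes) -> bytes:
    """Deterministic map from the stored secret to the key actually used."""
    return hashlib.sha256(r).digest()

r = secrets.token_bytes(SECRET_BYTES)
key = erf(r)                          # use erf(r), never r itself, as the key

leaked = r[:SECRET_BYTES - HIDDEN]    # almost-complete exposure of r
# The adversary sees `leaked`, but with 128 bits of r still unknown it
# cannot feasibly enumerate r, so erf(r) remains unpredictable to it.
assert len(key) == 32
```

The point of the interface is that the system stores r but uses erf(r), so partial exposure of storage reveals nothing useful about the operative key.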
Another look at generic groups
 Advances in Mathematics of Communications
, 2006
Abstract

Cited by 9 (3 self)
(Communicated by Andreas Stein) Starting with Shoup’s seminal paper [24], the generic group model has been an important tool in reductionist security arguments. After an informal explanation of this model and Shoup’s theorem, we discuss the danger of flaws in proofs. We next describe an ontological difference between the generic group assumption and the random oracle model for hash functions. We then examine some criticisms that have been leveled at the generic group model and raise some questions of our own.
k-resilient identity-based encryption in the standard model
In Topics in Cryptology – CT-RSA 2004
, 2004
Abstract

Cited by 8 (0 self)
We present and analyze an adaptive chosen ciphertext secure (IND-CCA) identity-based encryption scheme (IBE) based on the well-studied Decisional Diffie-Hellman (DDH) assumption. The scheme is provably secure in the standard model assuming the adversary can adaptively corrupt up to a maximum of k users. In contrast, the Boneh-Franklin scheme is proven secure only in the random-oracle model. Key words: identity-based encryption, standard model