Results 1–10 of 131
Entity Authentication and Key Distribution
, 1993
"... Entity authentication and key distribution are central cryptographic problems in distributed computing  but up until now, they have lacked even a meaningful definition. One consequence is that incorrect and inefficient protocols have proliferated. This paper provides the first treatment of these p ..."
Abstract

Cited by 469 (12 self)
Entity authentication and key distribution are central cryptographic problems in distributed computing, but until now they have lacked even a meaningful definition. One consequence is that incorrect and inefficient protocols have proliferated. This paper provides the first treatment of these problems in the complexity-theoretic framework of modern cryptography. Addressed in detail are two problems of the symmetric, two-party setting: mutual authentication and authenticated key exchange. For each we present a definition, a protocol, and a proof that the protocol meets its goal, under the (minimal) assumption of a pseudorandom function. When this assumption is appropriately instantiated, the protocols given are practical and efficient.
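The symmetric two-party setting described above can be illustrated with a minimal challenge-response sketch. This is not the paper's exact protocol, only the general pattern it formalizes; the PRF is instantiated (as one plausible choice) with HMAC-SHA256, and all message formats here are assumptions for illustration.

```python
# Hedged sketch of PRF-based mutual authentication between two parties
# sharing a symmetric key. Each side proves knowledge of the key by
# tagging both parties' fresh nonces; role labels prevent reflection.
import hashlib
import hmac
import os

def prf(key: bytes, msg: bytes) -> bytes:
    """Pseudorandom function instantiated with HMAC-SHA256 (one choice)."""
    return hmac.new(key, msg, hashlib.sha256).digest()

shared_key = os.urandom(32)   # pre-shared symmetric key

# A -> B: fresh challenge R_A
r_a = os.urandom(16)
# B -> A: fresh challenge R_B plus a tag binding both nonces to B's role
r_b = os.urandom(16)
tag_b = prf(shared_key, b"B" + r_a + r_b)
# A verifies B's tag, then replies with its own tag over the same nonces
assert hmac.compare_digest(tag_b, prf(shared_key, b"B" + r_a + r_b))
tag_a = prf(shared_key, b"A" + r_a + r_b)
# B verifies A's tag; both parties are now mutually authenticated
assert hmac.compare_digest(tag_a, prf(shared_key, b"A" + r_a + r_b))
```

An adversary without the key cannot forge either tag if the PRF assumption holds, which is the intuition the paper's definitions and proofs make precise.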
Security Arguments for Digital Signatures and Blind Signatures
 JOURNAL OF CRYPTOLOGY
, 2000
"... Since the appearance of publickey cryptography in the seminal DiffieHellman paper, many new schemes have been proposed and many have been broken. Thus, the ..."
Abstract

Cited by 283 (35 self)
Since the appearance of public-key cryptography in the seminal Diffie-Hellman paper, many new schemes have been proposed and many have been broken. Thus, the ...
How to Go Beyond the Black-Box Simulation Barrier
 In 42nd FOCS
, 2001
"... The simulation paradigm is central to cryptography. A simulator is an algorithm that tries to simulate the interaction of the adversary with an honest party, without knowing the private input of this honest party. Almost all known simulators use the adversary’s algorithm as a blackbox. We present t ..."
Abstract

Cited by 221 (14 self)
The simulation paradigm is central to cryptography. A simulator is an algorithm that tries to simulate the interaction of the adversary with an honest party, without knowing the private input of this honest party. Almost all known simulators use the adversary's algorithm as a black box. We present the first constructions of non-black-box simulators. Using these new non-black-box techniques we obtain several results that were previously proven impossible to obtain using black-box simulators. Specifically, assuming the existence of collision-resistant hash functions, we construct a new zero-knowledge argument system for NP that satisfies the following properties: 1. The system has a constant number of rounds with negligible soundness error. 2. It remains zero-knowledge even when composed concurrently n times, where n is the security parameter. Simultaneously obtaining 1 and 2 has recently been proven impossible to achieve using black-box simulators. 3. It is an Arthur-Merlin (public-coin) protocol. Simultaneously obtaining 1 and 3 was known to be impossible to achieve with a black-box simulator. 4. It has a simulator that runs in strict polynomial time, rather than expected polynomial time. All previously known constant-round, negligible-error zero-knowledge arguments used expected polynomial-time simulators.
On the Composition of Zero-Knowledge Proof Systems
 SIAM Journal on Computing
, 1990
"... : The wide applicability of zeroknowledge interactive proofs comes from the possibility of using these proofs as subroutines in cryptographic protocols. A basic question concerning this use is whether the (sequential and/or parallel) composition of zeroknowledge protocols is zeroknowledge too. We ..."
Abstract

Cited by 195 (14 self)
The wide applicability of zero-knowledge interactive proofs comes from the possibility of using these proofs as subroutines in cryptographic protocols. A basic question concerning this use is whether the (sequential and/or parallel) composition of zero-knowledge protocols is zero-knowledge too. We demonstrate the limitations of composing zero-knowledge protocols by proving that the original definition of zero-knowledge is not closed under sequential composition, and that even the strong formulations of zero-knowledge (e.g. black-box simulation) are not closed under parallel execution. We present lower bounds on the round complexity of zero-knowledge proofs, with significant implications for the parallelization of zero-knowledge protocols. We prove that 3-round interactive proofs and constant-round Arthur-Merlin proofs that are black-box simulation zero-knowledge exist only for languages in BPP. In particular, it follows that the "parallel versions" of the first interactive proo...
On Defining Proofs of Knowledge
, 1998
"... The notion of a "proof of knowledge," suggested by Gold wasset, Micali and Rackoff, has been used in many works as a tool for the construction of cryptographic protocols and other schemes. Yet the commonly cited formalizations of this notion are unsatisfactory and in particular inadequate for s ..."
Abstract

Cited by 143 (23 self)
The notion of a "proof of knowledge," suggested by Goldwasser, Micali and Rackoff, has been used in many works as a tool for the construction of cryptographic protocols and other schemes. Yet the commonly cited formalizations of this notion are unsatisfactory and in particular inadequate for some of the applications in which they are used. Consequently, ...
Universally Composable Commitments
, 2001
"... We propose a new security measure for commitment protocols, called Universally Composable ..."
Abstract

Cited by 142 (8 self)
We propose a new security measure for commitment protocols, called Universally Composable ...
Designated Verifier Proofs and Their Applications
, 1996
"... For many proofs of knowledge it is important that only the verifier designated by the confirmer can obtain any conviction of the correctness of the proof. A good example of such a situation is for undeniable signatures, where the confirmer of a signature wants to make sure that only the intended ver ..."
Abstract

Cited by 135 (5 self)
For many proofs of knowledge it is important that only the verifier designated by the confirmer can obtain any conviction of the correctness of the proof. A good example of such a situation is undeniable signatures, where the confirmer of a signature wants to make sure that only the intended verifier(s) can in fact be convinced of the validity or invalidity of the signature. In general, authentication of messages and off-the-record messages are in conflict with each other. We show how, using designation of verifiers, these notions can be combined, allowing authenticated but private conversations to take place. Our solution guarantees that only the specified verifier can be convinced by the proof, even if he shares all his secret information with entities that want to be convinced. Our solution is based on trapdoor commitments [4], allowing the designated verifier to open up commitments in any way he wants. We demonstrate how a trapdoor commitment scheme can be used to constr...
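The trapdoor-commitment property the abstract relies on can be sketched concretely. The following is a toy Pedersen-style commitment over a tiny group (the parameters p, q, g, t here are illustrative assumptions, far below secure sizes, and this is not necessarily the scheme cited as [4]): whoever holds the trapdoor t can reopen any commitment to any message, which is exactly what lets the designated verifier simulate proofs and so prevents them from convincing anyone else.

```python
# Hedged sketch: Pedersen-style trapdoor commitment c = g^m * h^r mod p,
# where h = g^t and t is the designated verifier's trapdoor.
# Toy parameters only; real deployments use large prime-order groups.
p, q, g = 23, 11, 2        # g has prime order q in Z_p^*
t = 7                      # trapdoor held by the designated verifier
h = pow(g, t, p)           # second generator h = g^t

def commit(m: int, r: int) -> int:
    """Commit to message m with randomness r."""
    return (pow(g, m, p) * pow(h, r, p)) % p

def equivocate(m: int, r: int, m_new: int) -> int:
    """Trapdoor holder recomputes randomness so the same commitment
    opens to m_new: solves m' + t*r' = m + t*r (mod q)."""
    t_inv = pow(t, -1, q)
    return (r + t_inv * (m - m_new)) % q

c = commit(3, 5)
r_new = equivocate(3, 5, 10)
assert commit(10, r_new) == c   # same commitment, different opening
```

Because the designated verifier could have produced any opening himself, a transcript he shows to a third party carries no conviction, giving the authenticated-yet-deniable conversations described above.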
Efficient Concurrent Zero-Knowledge in the Auxiliary String Model
, 2000
"... We show that if any oneway function exists, then 3round concurrent zeroknowledge arguments for all NP problems can be built in a model where a short auxiliary string with a prescribed distribution is available to the players. We also show that a wide range of known efficient proofs of knowledge ..."
Abstract

Cited by 109 (2 self)
We show that if any one-way function exists, then 3-round concurrent zero-knowledge arguments for all NP problems can be built in a model where a short auxiliary string with a prescribed distribution is available to the players. We also show that a wide range of known efficient proofs of knowledge using specialized assumptions can be modified to work in this model with no essential loss of efficiency. We argue that the assumptions of the model will be satisfied in many practical scenarios where public-key cryptography is used; in particular, our construction works given any secure public-key infrastructure. Finally, we point out that in a model with preprocessing (and no auxiliary string) proposed earlier, concurrent zero-knowledge for NP can be based on any one-way function.
Universally Composable Notions of Key Exchange and Secure Channels
, 2002
"... Abstract. Recently, Canetti and Krawczyk (Eurocrypt’2001) formulated a notion of security for keyexchange (ke) protocols, called SKsecurity, and showed that this notion suffices for constructing secure channels. However, their model and proofs do not suffice for proving more general composability p ..."
Abstract

Cited by 104 (7 self)
Recently, Canetti and Krawczyk (Eurocrypt 2001) formulated a notion of security for key-exchange (KE) protocols, called SK-security, and showed that this notion suffices for constructing secure channels. However, their model and proofs do not suffice for proving more general composability properties of SK-secure KE protocols. We show that while the notion of SK-security is strictly weaker than a fully idealized notion of key-exchange security, it is sufficiently robust to provide secure composition with arbitrary protocols. In particular, SK-security guarantees the security of the key for any application that wishes to set up secret keys between pairs of parties. We also provide new definitions of secure-channels protocols with similarly strong composability properties, and show that SK-security suffices for obtaining these definitions. To obtain these results we use the recently proposed framework of "universally composable (UC) security." We also use a new tool, called "non-information oracles," which will probably find applications beyond the present case. These tools allow us to bridge between seemingly limited indistinguishability-based definitions such as SK-security and more powerful, simulation-based definitions, such as UC security, for which general composition theorems can be proven. Furthermore, based on such composition theorems we reduce the analysis of a full-fledged multi-session key-exchange protocol to the (simpler) analysis of individual, stand-alone key-exchange sessions.
Black-Box Concurrent Zero-Knowledge Requires (almost) Logarithmically Many Rounds
 SIAM Journal on Computing
, 2002
"... We show that any concurrent zeroknowledge protocol for a nontrivial language (i.e., for a language outside BPP), whose security is proven via blackbox simulation, must use at least ~ \Omega\Gamma/10 n) rounds of interaction. This result achieves a substantial improvement over previous lower bound ..."
Abstract

Cited by 89 (6 self)
We show that any concurrent zero-knowledge protocol for a non-trivial language (i.e., for a language outside BPP), whose security is proven via black-box simulation, must use at least Ω̃(log n) rounds of interaction. This result is a substantial improvement over previous lower bounds, and is the first bound to rule out the possibility of constant-round concurrent zero-knowledge when proven via black-box simulation. Furthermore, the bound is polynomially related to the number of rounds in the best known concurrent zero-knowledge protocol for languages in NP (which is established via black-box simulation).