Results 1–10 of 15
Automated Analysis of Cryptographic Protocols Using Murphi
, 1997
Cited by 263 (23 self)
A methodology is presented for using a general-purpose state enumeration tool, Murphi, to analyze cryptographic and security-related protocols. We illustrate the feasibility of the approach by analyzing the Needham-Schroeder protocol, finding a known bug in a few seconds of computation time, and analyzing variants of Kerberos and the faulty TMN protocol used in another comparative study. The efficiency of Murphi allows us to examine multiple runs of relatively short protocols, giving us the ability to detect replay attacks, or errors resulting from confusion between independent executions of a protocol by independent parties.
A Probabilistic PolyTime Framework for Protocol Analysis
, 1998
Cited by 114 (7 self)
We develop a framework for analyzing security protocols in which protocol adversaries may be arbitrary probabilistic polynomial-time processes. In this framework, protocols are written in a form of process calculus where security may be expressed in terms of observational equivalence, a standard relation from programming language theory that involves quantifying over possible environments that might interact with the protocol. Using an asymptotic notion of probabilistic equivalence, we relate observational equivalence to polynomial-time statistical tests and discuss some example protocols to illustrate the potential of this approach.
A probabilistic polynomial-time calculus for analysis of cryptographic protocols
 Electronic Notes in Theoretical Computer Science
, 2001
Cited by 44 (8 self)
We prove properties of a process calculus that is designed for analyzing security protocols. Our long-term goal is to develop a form of protocol analysis, consistent with standard cryptographic assumptions, that provides a language for expressing probabilistic polynomial-time protocol steps, a specification method based on a compositional form of equivalence, and a logical basis for reasoning about equivalence. The process calculus is a variant of CCS, with bounded replication and probabilistic polynomial-time expressions allowed in messages and boolean tests. To avoid inconsistency between security and nondeterminism, messages are scheduled probabilistically instead of nondeterministically. We prove that evaluation of any process expression halts in probabilistic polynomial time and define a form of asymptotic protocol equivalence that allows security properties to be expressed using observational equivalence, a standard relation from programming language theory that involves quantifying over possible environments that might interact with the protocol. We develop a form of probabilistic bisimulation and use it to establish the soundness of an equational proof system based on observational equivalences. The proof system is illustrated by a formal derivation of the assertion, well-known in cryptography, that ElGamal encryption’s semantic security is equivalent to the (computational) Decision Diffie-Hellman assumption. This example demonstrates the power of probabilistic bisimulation and equational reasoning for protocol security.
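The ElGamal scheme whose semantic security the abstract relates to the Decision Diffie-Hellman assumption has a very small footprint; the following is a toy Python sketch of it. The group parameters here are illustrative only (far too small to be secure) and the function names are our own.

```python
# Toy ElGamal over a prime-order subgroup, illustrating the structure
# behind the semantic-security <-> DDH equivalence discussed above.
# Parameters are for demonstration only and offer no real security.

import random

P = 467          # prime modulus (toy size), P = 2*Q + 1
Q = 233          # prime order of the subgroup
G = 4            # generator of the order-Q subgroup of Z_P^*

def keygen():
    x = random.randrange(1, Q)        # secret key
    return x, pow(G, x, P)            # (sk, pk = g^x)

def encrypt(pk, m):
    r = random.randrange(1, Q)
    return pow(G, r, P), (m * pow(pk, r, P)) % P   # (g^r, m * g^{xr})

def decrypt(sk, c):
    c1, c2 = c
    return (c2 * pow(c1, Q - sk, P)) % P           # divide out c1^x

sk, pk = keygen()
m = pow(G, 5, P)                      # message encoded in the subgroup
assert decrypt(sk, encrypt(pk, m)) == m
```

Distinguishing two encrypted messages amounts to distinguishing (g^x, g^r, g^{xr}) from a random triple, which is exactly the DDH problem.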
Probabilistic Polynomial-Time Process Calculus and Security Protocol Analysis
 Theoretical Computer Science
, 2006
Cited by 36 (3 self)
We prove properties of a process calculus that is designed for analysing security protocols. Our long-term goal is to develop a form of protocol analysis, consistent with standard cryptographic assumptions, that provides a language for expressing probabilistic polynomial-time protocol steps, a specification method based on a compositional form of equivalence, and a logical basis for reasoning about equivalence. The process calculus is a variant of CCS, with bounded replication and probabilistic polynomial-time expressions allowed in messages and boolean tests. To avoid inconsistency between security and nondeterminism, messages are scheduled probabilistically instead of nondeterministically. We prove that evaluation of any process expression halts in probabilistic polynomial time and define a form of asymptotic protocol equivalence that allows security properties to be expressed using observational equivalence, a standard relation from programming language theory that involves quantifying over all possible environments that might interact with the protocol. We develop a form of probabilistic bisimulation and use it to establish the soundness of an equational proof system based on observational equivalences. The proof system is illustrated by a formal derivation of the assertion, well-known in cryptography, that ElGamal encryption’s semantic security is equivalent to the (computational) Decision Diffie-Hellman assumption. This example demonstrates the power of probabilistic bisimulation and equational reasoning for protocol security.
Optimistic Fair Secure Computation
 In Advances in Cryptology – CRYPTO ’00
, 2000
Cited by 26 (1 self)
We present an efficient and fair protocol for secure two-party computation in the optimistic model, where a partially trusted third party T is available, but not involved in normal protocol executions. T is needed only if communication is disrupted or if one of the two parties misbehaves. The protocol guarantees that although one party may terminate the protocol at any time, the computation remains fair for the other party. The two parties are linked by an asynchronous communication network only, but the link between each party and T requires minimal synchrony. All our protocols are based on efficient proofs of knowledge and involve no general zero-knowledge tools. As intermediate steps we describe efficient implementations of verifiable oblivious transfer, escrowed oblivious transfer, and verifiable secure function evaluation, which may be useful in other contexts. The security of all protocols is proved under the decisional Diffie-Hellman assumption.
SAS-based group authentication and key agreement protocols
 In Public Key Cryptography
, 2008
Cited by 15 (0 self)
New trends in consumer electronics have created a strong demand for fast, reliable and user-friendly key agreement protocols. However, many key agreement protocols are secure only against passive attacks. Therefore, message authentication is often unavoidable in order to achieve security against active adversaries. Pasini and Vaudenay were the first to propose a new compelling methodology for message authentication. Namely, their two-party protocol uses short authenticated strings (SAS) instead of pre-shared secrets or public-key infrastructure, which are the classical tools for achieving authenticity. In this article, we generalise this methodology to multi-party settings. We give a new group message authentication protocol that utilises only limited authenticated communication and show how to combine this protocol with classical key agreement procedures. More precisely, we describe how to transform any group key agreement protocol that is secure against passive attacks into a new protocol that is secure against active attacks.
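The core mechanic behind SAS-based authentication can be sketched in a few lines: each party compresses its view of the insecure-channel transcript into a short string, and the parties compare those strings over a low-bandwidth authenticated channel (e.g. reading digits aloud). This Python sketch uses a plain hash and a 6-digit SAS purely for illustration; real SAS protocols use commitment schemes to prevent an attacker from grinding collisions, so treat the function name and length as assumptions, not the paper's construction.

```python
# Minimal sketch of the short-authenticated-string (SAS) idea: compress
# the transcript to a short string that humans can compare out of band.
# Illustrative only: real SAS protocols add commitments to stop an
# active attacker from searching for a colliding transcript.

import hashlib

def sas(transcript: bytes, digits: int = 6) -> str:
    """Compress a protocol transcript into a short comparable string."""
    h = hashlib.sha256(transcript).digest()
    value = int.from_bytes(h[:8], "big") % 10**digits
    return f"{value:0{digits}d}"

# Both parties saw the same messages over the insecure channel:
alice_view = b"msg1|msg2|commitments..."
bob_view = b"msg1|msg2|commitments..."
assert sas(alice_view) == sas(bob_view)

# An attacker who altered a message changes the transcript, so the two
# SAS values almost surely differ:
tampered_view = b"msg1|msg2'|commitments..."
print(sas(alice_view), sas(tampered_view))
```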
Semantic security under related-key attacks and applications
, 2011
Cited by 14 (1 self)
In a related-key attack (RKA) an adversary attempts to break a cryptographic primitive by invoking the primitive with several secret keys which satisfy some known, or even chosen, relation. We initiate a formal study of RKA security for randomized encryption schemes. We begin by providing general definitions for semantic security under passive and active RKAs. We then focus on RKAs in which the keys satisfy known linear relations over some Abelian group. We construct simple and efficient schemes which resist such RKAs even when the adversary can choose the linear relation adaptively during the attack. More concretely, we present two approaches for constructing RKA-secure encryption schemes. The first is based on standard randomized encryption schemes which additionally satisfy a natural “key-homomorphism” property. We instantiate this approach under number-theoretic or lattice-based assumptions such as the Decisional Diffie-Hellman (DDH) assumption and the Learning Noisy Linear Equations assumption. Our second approach is based on RKA-secure pseudorandom generators. This approach can yield either deterministic, one-time-use schemes with optimal ciphertext size or randomized unlimited-use schemes. We instantiate this approach by constructing a simple RKA-secure pseudorandom generator.
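One way to see the “key-homomorphism” property the abstract mentions is through toy ElGamal: a ciphertext under secret key x can be shifted, publicly, into a valid ciphertext under the linearly related key x + delta. This Python sketch uses illustrative (insecure) parameters and is our own worked example of the property, not the paper's scheme.

```python
# Sketch of key-homomorphism with toy ElGamal (insecure parameters):
# a ciphertext under key x maps publicly to one under key x + delta
# via (c1, c2) -> (c1, c2 * c1^delta), since c2 * c1^delta
# = m * g^{xr} * g^{r*delta} = m * g^{(x+delta)r}.

import random

P, Q, G = 467, 233, 4            # toy group: order-Q subgroup of Z_P^*

x = random.randrange(1, Q)       # original secret key
delta = random.randrange(1, Q)   # known linear shift of the key
m = pow(G, 42, P)                # message encoded in the subgroup

r = random.randrange(1, Q)
c1, c2 = pow(G, r, P), (m * pow(G, x * r, P)) % P   # Enc under key x

# Public shift to a ciphertext under the related key x + delta:
c2_shift = (c2 * pow(c1, delta, P)) % P

# Decrypting with the related key x + delta recovers m:
k = (x + delta) % Q
recovered = (c2_shift * pow(c1, (Q - k) % Q, P)) % P
assert recovered == m
```

This is exactly the kind of linear key relation over an Abelian group (here, Z_Q) that the definitions in the paper are designed to handle.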
Algorithmic Techniques in Verification by Explicit State Enumeration
, 1997
Cited by 8 (4 self)
Modern digital systems often employ sophisticated protocols. Unfortunately, designing correct protocols is a subtle art. Even when using great care, a designer typically cannot foresee all possible interactions among the components of the system; thus, bugs like subtle race conditions or deadlocks are easily overlooked. One way a computer can support the designer is by simulating random executions of the system. With this simulation approach, however, there is a high probability of missing executions containing errors, especially in complex systems. In contrast, an automatic verifier tries to examine all states reachable from a given set of start states. The biggest obstacle in this exhaustive approach is that there is often a very large number of reachable states. This thesis describes three techniques to increase the size of the reachable state spaces that can be handled in automatic verifiers. The techniques work in verifiers that are based on explicitly storing each reachable ...
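Explicit-state enumeration of the kind this thesis studies can be sketched in a few lines: a breadth-first search over all reachable states, storing each visited state in a hash set and checking an invariant at every state. The modelled system below (a deliberately flawed two-process mutual-exclusion protocol with a check-then-act race) is our own made-up example, not one from the thesis.

```python
# Toy explicit-state enumerator: BFS over reachable states with a hash
# set of visited states. The flawed protocol checks the other process's
# flag *before* setting its own, so both can slip into the critical
# section -- exactly the kind of race exhaustive search uncovers.

from collections import deque

# State: (pc0, pc1, flag0, flag1); pc in {"idle", "ok", "crit"}.
def successors(state):
    pc0, pc1, f0, f1 = state
    pcs, flags = [pc0, pc1], [f0, f1]
    for i in (0, 1):
        j = 1 - i
        p, f = pcs[:], flags[:]
        if pcs[i] == "idle" and flags[j] == 0:   # racy test; flag not set yet
            p[i] = "ok"
        elif pcs[i] == "ok":                     # now set own flag and enter
            p[i], f[i] = "crit", 1
        elif pcs[i] == "crit":                   # leave, clear flag
            p[i], f[i] = "idle", 0
        else:
            continue                             # blocked: other flag is up
        yield (p[0], p[1], f[0], f[1])

def find_violation(start=("idle", "idle", 0, 0)):
    seen, frontier = {start}, deque([start])
    while frontier:
        s = frontier.popleft()
        if s[0] == s[1] == "crit":               # mutual exclusion violated
            return s
        for t in successors(s):
            if t not in seen:                    # store each reachable state
                seen.add(t)
                frontier.append(t)
    return None

print(find_violation())                          # ('crit', 'crit', 1, 1)
```

The three techniques the thesis describes all aim at the `seen` set above: in realistic models it is this explicitly stored state table that exhausts memory first.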
After-the-fact leakage in public-key encryption
 TCC 2011, volume 6597 of LNCS
, 2011
Cited by 8 (0 self)
What does it mean for an encryption scheme to be leakage-resilient? Prior formulations require that the scheme remain semantically secure even in the presence of leakage, but only considered leakage that occurs before the challenge ciphertext is generated. Although seemingly necessary, this restriction severely limits the usefulness of the resulting notion. In this work we study after-the-fact leakage, namely leakage that the adversary obtains after seeing the challenge ciphertext. We seek a “natural” and realizable notion of security, which is usable in higher-level protocols and applications. To this end, we formulate entropic leakage-resilient PKE. This notion captures the intuition that as long as the entropy of the encrypted message is higher than the amount of leakage, the message still has some (pseudo-)entropy left. We show that this notion is realized by the Naor-Segev constructions (using hash proof systems). We demonstrate that entropic leakage-resilience is useful by showing a simple construction that uses it to achieve semantic security in the presence of after-the-fact leakage, in a model of bounded memory leakage from a split state.
Relations among notions of security for identity-based encryption schemes. Cryptology ePrint Archive, Report 2005/258
 In Latin American Theoretical Informatics (LATIN ’06), volume 3887 of LNCS
, 2005
Cited by 4 (2 self)
This paper shows that the standard security notion for identity-based encryption schemes (IBE), that is, IND-ID-CCA2, captures the essence of security for all IBE schemes. To achieve this intention, we first describe formal definitions of the notions of security for IBE, and then present the relations among OW, IND, SS and NM in IBE, along with rigorous proofs. With the aim of comprehensiveness, notions of security for IBE in the context of encryption of multiple messages and/or to multiple receivers are finally presented. All of these results are proposed with consideration of the particular attack in IBE, namely the adaptive chosen-identity attack.