Automated Analysis of Cryptographic Protocols Using Murphi
, 1997
Cited by 262 (22 self)
A methodology is presented for using a general-purpose state enumeration tool, Murphi, to analyze cryptographic and security-related protocols. We illustrate the feasibility of the approach by analyzing the Needham-Schroeder protocol, finding a known bug in a few seconds of computation time, and analyzing variants of Kerberos and the faulty TMN protocol used in another comparative study. The efficiency of Murphi allows us to examine multiple runs of relatively short protocols, giving us the ability to detect replay attacks, or errors resulting from confusion between independent executions of a protocol by independent parties.
CMC: A Pragmatic Approach to Model Checking Real Code
 In Proceedings of the Fifth Symposium on Operating Systems Design and Implementation
, 2002
Cited by 182 (11 self)
Permission is granted for non-commercial reproduction of the work for educational or research purposes.
The Murφ Verification System
 In Computer Aided Verification, 8th International Conference
, 1996
Cited by 136 (8 self)
This is a brief overview of the Murφ verification system.
An Analysis of Bitstate Hashing
, 1995
Cited by 82 (3 self)
The bitstate hashing, or supertrace, technique was introduced in 1987 as a method to increase the quality of verification by reachability analysis for applications that defeat analysis by traditional means because of their size. Since then, the technique has been included in many research verification tools, and has been adopted in commercially marketed tools. It is therefore important that we understand well how and why the method works, what its limitations are, and how it compares with alternative methods over a broad range of problem sizes. ...
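The core of the supertrace idea can be shown in a few lines: the visited-state set is a plain bit array, so each state costs one bit instead of its full state vector. This is a minimal illustrative sketch, not the actual implementation analyzed in the paper; the class name and the use of SHA-256 are my own choices.

```python
import hashlib

class BitstateTable:
    """Visited-state set stored as a bit array (supertrace).
    Each state maps to a single bit; a set bit means "probably seen".
    Two distinct states can hash to the same bit, which is exactly the
    state-omission risk the paper analyzes."""

    def __init__(self, num_bits):
        self.num_bits = num_bits
        self.bits = bytearray((num_bits + 7) // 8)

    def _bit_index(self, state):
        digest = hashlib.sha256(state).digest()
        return int.from_bytes(digest[:8], "big") % self.num_bits

    def seen_or_mark(self, state):
        """Return True if the state's bit was already set; otherwise set it."""
        i = self._bit_index(state)
        byte, mask = i // 8, 1 << (i % 8)
        if self.bits[byte] & mask:
            return True
        self.bits[byte] |= mask
        return False
```

A search loop would skip any successor for which `seen_or_mark` returns True; the accuracy analysis in the paper concerns how often that True is a false positive.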
Parallelizing the Murφ Verifier
 In Computer Aided Verification, 9th International Conference
, 1997
Cited by 55 (0 self)
With the use of state and memory reduction techniques in verification by explicit state enumeration, runtime becomes a major limiting factor. We describe a parallel version of the explicit state enumeration verifier Murφ for distributed memory multiprocessors and networks of workstations that is based on the message passing paradigm. In experiments with three complex cache coherence protocols, parallel Murφ shows close to linear speedups, which are largely insensitive to communication latency and bandwidth. There is some slowdown with increasing communication overhead, for which a simple yet relatively accurate approximation formula is given. Techniques to reduce overhead and required bandwidth and to allow heterogeneity and dynamically changing load in the parallel machine are discussed, which we expect will allow good speedups when using conventional networks of workstations.
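One building block of such a message-passing verifier is a static partition function that deterministically assigns each state to an owning worker, so all workers agree on ownership without coordination. A minimal sketch under that assumption (the hash choice and function name are illustrative, not taken from the paper):

```python
import hashlib

def owner(state, num_workers):
    """Static partition function for distributed state enumeration.
    The worker owning a state is derived from a hash of the state, so
    every worker computes the same owner with no communication. Each
    worker stores only its own slice of the state table and forwards
    newly generated successors to owner(successor, num_workers)."""
    h = int.from_bytes(hashlib.sha256(state).digest()[:8], "big")
    return h % num_workers
```

Because successor generation and table lookups then proceed independently on each worker, speedup is limited mainly by the cost of forwarding states, which matches the paper's observation that overhead grows with communication cost.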
Using Magnetic Disk Instead of Main Memory in the Murφ Verifier
, 1998
Cited by 33 (2 self)
In verification by explicit state enumeration, a randomly accessed state table is maintained. In practice, the total main memory available for this state table is a major limiting factor in verification. We describe a version of the explicit state enumeration verifier Murφ that allows using magnetic disk instead of main memory for storing almost all of the state table. The algorithm avoids costly random accesses to disk and amortizes the cost of linearly reading the state table from disk over all states in a certain breadth-first level. The remaining runtime overhead for accessing the disk can be strongly reduced by combining the scheme with hash compaction. We show how to do this combination efficiently and analyze the resulting algorithm. In experiments with three complex cache coherence protocols, the new algorithm achieves memory savings factors of one to two orders of magnitude with a runtime overhead of typically only around 15%.
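The two ideas combined here, short hash-compacted signatures and level-at-a-time access to the visited set, can be sketched as follows. This is a simplified in-memory illustration of the access pattern, not the paper's algorithm; the signature width, names, and use of SHA-256 are my own assumptions.

```python
import hashlib

SIG_BYTES = 5  # keep a 40-bit signature per state, not the full state

def compact(state):
    """Hash compaction: store only a short signature of each visited
    state. Distinct states may collide (a small omission probability),
    but memory per state shrinks to SIG_BYTES."""
    return hashlib.sha256(state).digest()[:SIG_BYTES]

def bfs_levels(start_states, successors):
    """Breadth-first reachability that touches the visited set only in
    whole-level batches, mimicking one linear scan of a disk-resident
    table per level instead of random accesses. Returns the number of
    distinct signatures seen."""
    old = set()                        # the (conceptually on-disk) table
    frontier = {compact(s): s for s in start_states}
    while frontier:
        old |= set(frontier)           # one sequential "write" per level
        nxt = {}
        for s in frontier.values():
            for t in successors(s):
                sig = compact(t)
                if sig not in old:     # one sequential "read" per level
                    nxt[sig] = t
        frontier = nxt
    return len(old)
```

In the real disk-based scheme, `old` lives on disk and the membership check is done by linearly merging the sorted new signatures against it once per level, which is where the amortization comes from.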
Fast and Accurate Bitstate Verification for SPIN
 In Proceedings of the 11th International SPIN Workshop on Model Checking of Software (SPIN)
, 2004
Cited by 17 (2 self)
Bitstate hashing in SPIN has proved invaluable in probabilistically detecting errors in large models, but in many cases, the number of omitted states is much higher than it would be if SPIN allowed more than two hash functions to be used. For example, adding just one more hash function can reduce the probability of omitting any states from 99% to under 3%. Because hash computation accounts for an overwhelming portion of the total execution cost of bitstate verification with SPIN, adding additional independent hash functions would slow down the process tremendously. We present efficient ways of computing multiple hash values that, despite sacrificing independence, give virtually the same accuracy and even yield a speed improvement in the two-hash-function case when compared to the current SPIN implementation. Another key to accurate bitstate hashing is utilizing as much memory as is available. The current SPIN implementation is limited to only 512 MB and allows only power-of-two granularity (256 MB, 128 MB, etc.). However, using 768 MB instead of 512 MB could reduce the probability of a single omission from 20% to less than one chance in 10,000, which demonstrates the magnitude of both the maximum-size and the granularity limitations. We have modified SPIN to utilize any addressable amount of memory and use any number of efficiently computed hash functions, and we present empirical results from extensive experimentation comparing various configurations of our modified version to the original SPIN.
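A standard way to get many cheap hash functions from one digest, in the spirit of what the abstract describes, is the `h1 + i*h2` construction. This sketch is illustrative only and is not the paper's actual hash scheme; note that taking the result modulo `num_bits` also works for any table size, not just powers of two, which is the granularity point above.

```python
import hashlib

def k_bit_indices(state, k, num_bits):
    """Derive k bit positions from a single digest: two base values h1
    and h2 are extracted once, and the i-th 'hash function' is just
    h1 + i*h2. Extra hash functions therefore cost one multiply-add
    instead of a full independent hash computation."""
    digest = hashlib.sha256(state).digest()
    h1 = int.from_bytes(digest[:8], "big")
    h2 = int.from_bytes(digest[8:16], "big") | 1  # odd, so positions spread
    return [(h1 + i * h2) % num_bits for i in range(k)]
```

A bitstate table would set all k bits for each visited state and report "seen" only when all k are already set, trading perfect independence for speed, as the abstract argues.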
Bloom Filters in Probabilistic Verification
 In Proceedings of the 5th International Conference on Formal Methods in Computer-Aided Design (FMCAD)
, 2004
Algorithmic Techniques in Verification by Explicit State Enumeration
, 1997
Cited by 7 (4 self)
Modern digital systems often employ sophisticated protocols. Unfortunately, designing correct protocols is a subtle art. Even when using great care, a designer typically cannot foresee all possible interactions among the components of the system; thus, bugs like subtle race conditions or deadlocks are easily overlooked. One way a computer can support the designer is by simulating random executions of the system. With this simulation approach, however, there is a high probability of missing executions containing errors, especially in complex systems. In contrast, an automatic verifier tries to examine all states reachable from a given set of start states. The biggest obstacle in this exhaustive approach is that there is often a very large number of reachable states. This thesis describes three techniques to increase the size of the reachable state spaces that can be handled in automatic verifiers. The techniques work in verifiers that are based on explicitly storing each reachable ...
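The exhaustive enumeration contrasted with random simulation above is, at its core, a breadth-first reachability search with an invariant check at every state. A minimal sketch (function names and the invariant interface are my own, not from the thesis):

```python
from collections import deque

def enumerate_states(start_states, successors, invariant):
    """Exhaustive breadth-first reachability: visit every state
    reachable from the start states and return the first one that
    violates the invariant (or None if none does). Unlike random
    simulation, no reachable state can be missed."""
    seen = set(start_states)
    queue = deque(start_states)
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return s                  # a reachable counterexample state
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None                       # every reachable state satisfies it
```

The `seen` set here is exactly the state table whose size the thesis's three techniques aim to tame.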
Guided Model Checking with a Bayesian Metaheuristic
 Fundamenta Informaticae
, 2006
Cited by 4 (0 self)
This paper presents a formal verification algorithm for finding errors in models of concurrent systems. The algorithm improves explicit guided model checking by applying the empirical Bayes method to revise heuristic estimates of the distance from a given state to an error state. Guided search using the revised estimates finds errors with less search effort than the original estimates.
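The baseline that the paper improves on is guided (best-first) search: states are expanded in order of a heuristic estimate of their distance to an error. A minimal sketch of that baseline, with the paper's empirical-Bayes revision abstracted away behind the `heuristic` parameter (all names here are my own):

```python
import heapq

def guided_search(start, successors, heuristic, is_error):
    """Greedy best-first search ordered by a heuristic estimate of the
    distance from each state to an error state. The better the
    estimates, the fewer expansions before an error is found, which is
    why revising the estimates (as the paper does) reduces search
    effort. Returns (error_state_or_None, number_of_expansions)."""
    seen = {start}
    frontier = [(heuristic(start), start)]
    expansions = 0
    while frontier:
        _, s = heapq.heappop(frontier)
        expansions += 1
        if is_error(s):
            return s, expansions
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                heapq.heappush(frontier, (heuristic(t), t))
    return None, expansions
```

Swapping in revised distance estimates changes only the priority ordering, leaving the search skeleton untouched.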